Oct 05 20:14:49 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 05 20:14:50 crc restorecon[4633]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 05 20:14:50 crc restorecon[4633]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc 
restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc 
restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 05 
20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 
crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 
20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 05 20:14:50 crc 
restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc 
restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc 
restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc 
restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc 
restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:50 crc restorecon[4633]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:50 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc 
restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 05 20:14:51 crc restorecon[4633]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 05 20:14:51 crc restorecon[4633]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 05 20:14:51 crc kubenswrapper[4753]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 05 20:14:51 crc kubenswrapper[4753]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 05 20:14:51 crc kubenswrapper[4753]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 05 20:14:51 crc kubenswrapper[4753]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 05 20:14:51 crc kubenswrapper[4753]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 05 20:14:51 crc kubenswrapper[4753]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.579689 4753 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585047 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585081 4753 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585092 4753 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585101 4753 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585110 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585118 4753 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585128 4753 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585193 4753 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585206 4753 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585218 4753 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585228 4753 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585237 4753 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585247 4753 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585256 4753 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585265 4753 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585273 4753 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585281 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585289 4753 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585297 4753 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585305 4753 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585312 4753 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585320 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585340 4753 feature_gate.go:330] unrecognized feature 
gate: PlatformOperators Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585348 4753 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585355 4753 feature_gate.go:330] unrecognized feature gate: Example Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585363 4753 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585370 4753 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585378 4753 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585386 4753 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585393 4753 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585401 4753 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585408 4753 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585418 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585426 4753 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585434 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585441 4753 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585449 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585456 4753 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585464 4753 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585475 4753 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585483 4753 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585491 4753 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585499 4753 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585508 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585515 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585523 4753 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585531 4753 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585539 4753 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585547 4753 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585554 4753 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585561 4753 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585569 4753 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 05 20:14:51 crc kubenswrapper[4753]: 
W1005 20:14:51.585577 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585585 4753 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585592 4753 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585601 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585608 4753 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585616 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585625 4753 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585634 4753 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585645 4753 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585654 4753 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585666 4753 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585674 4753 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585682 4753 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585690 4753 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585700 4753 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585709 4753 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585718 4753 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585728 4753 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.585739 4753 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586767 4753 flags.go:64] FLAG: --address="0.0.0.0" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586793 4753 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586811 4753 flags.go:64] FLAG: --anonymous-auth="true" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586825 4753 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586837 4753 flags.go:64] FLAG: 
--authentication-token-webhook="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586847 4753 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586859 4753 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586871 4753 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586881 4753 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586890 4753 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586900 4753 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586910 4753 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586919 4753 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586928 4753 flags.go:64] FLAG: --cgroup-root="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586937 4753 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586946 4753 flags.go:64] FLAG: --client-ca-file="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586956 4753 flags.go:64] FLAG: --cloud-config="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586965 4753 flags.go:64] FLAG: --cloud-provider="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586975 4753 flags.go:64] FLAG: --cluster-dns="[]" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586986 4753 flags.go:64] FLAG: --cluster-domain="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.586995 4753 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 05 20:14:51 crc 
kubenswrapper[4753]: I1005 20:14:51.587004 4753 flags.go:64] FLAG: --config-dir="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587012 4753 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587023 4753 flags.go:64] FLAG: --container-log-max-files="5" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587035 4753 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587044 4753 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587053 4753 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587062 4753 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587071 4753 flags.go:64] FLAG: --contention-profiling="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587080 4753 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587089 4753 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587099 4753 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587108 4753 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587119 4753 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587130 4753 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587166 4753 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587175 4753 flags.go:64] FLAG: --enable-load-reader="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587184 4753 
flags.go:64] FLAG: --enable-server="true" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587193 4753 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587205 4753 flags.go:64] FLAG: --event-burst="100" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587215 4753 flags.go:64] FLAG: --event-qps="50" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587223 4753 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587233 4753 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587241 4753 flags.go:64] FLAG: --eviction-hard="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587252 4753 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587261 4753 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587270 4753 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587280 4753 flags.go:64] FLAG: --eviction-soft="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587288 4753 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587297 4753 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587307 4753 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587316 4753 flags.go:64] FLAG: --experimental-mounter-path="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587324 4753 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587333 4753 flags.go:64] FLAG: --fail-swap-on="true" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587342 4753 
flags.go:64] FLAG: --feature-gates="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587354 4753 flags.go:64] FLAG: --file-check-frequency="20s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587363 4753 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587372 4753 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587382 4753 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587392 4753 flags.go:64] FLAG: --healthz-port="10248" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587401 4753 flags.go:64] FLAG: --help="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587410 4753 flags.go:64] FLAG: --hostname-override="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587419 4753 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587428 4753 flags.go:64] FLAG: --http-check-frequency="20s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587438 4753 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587446 4753 flags.go:64] FLAG: --image-credential-provider-config="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587456 4753 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587465 4753 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587474 4753 flags.go:64] FLAG: --image-service-endpoint="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587483 4753 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587491 4753 flags.go:64] FLAG: --kube-api-burst="100" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587501 4753 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587510 4753 flags.go:64] FLAG: --kube-api-qps="50" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587519 4753 flags.go:64] FLAG: --kube-reserved="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587528 4753 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587536 4753 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587546 4753 flags.go:64] FLAG: --kubelet-cgroups="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587555 4753 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587564 4753 flags.go:64] FLAG: --lock-file="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587572 4753 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587581 4753 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587590 4753 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587604 4753 flags.go:64] FLAG: --log-json-split-stream="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587613 4753 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587623 4753 flags.go:64] FLAG: --log-text-split-stream="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587632 4753 flags.go:64] FLAG: --logging-format="text" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587640 4753 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587650 4753 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 
20:14:51.587659 4753 flags.go:64] FLAG: --manifest-url="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587668 4753 flags.go:64] FLAG: --manifest-url-header="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587680 4753 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587689 4753 flags.go:64] FLAG: --max-open-files="1000000" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587701 4753 flags.go:64] FLAG: --max-pods="110" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587710 4753 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587718 4753 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587728 4753 flags.go:64] FLAG: --memory-manager-policy="None" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587738 4753 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587748 4753 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587757 4753 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587766 4753 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587787 4753 flags.go:64] FLAG: --node-status-max-images="50" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587796 4753 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587806 4753 flags.go:64] FLAG: --oom-score-adj="-999" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587815 4753 flags.go:64] FLAG: --pod-cidr="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587824 4753 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587837 4753 flags.go:64] FLAG: --pod-manifest-path="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587846 4753 flags.go:64] FLAG: --pod-max-pids="-1" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587856 4753 flags.go:64] FLAG: --pods-per-core="0" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587865 4753 flags.go:64] FLAG: --port="10250" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587873 4753 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587882 4753 flags.go:64] FLAG: --provider-id="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587891 4753 flags.go:64] FLAG: --qos-reserved="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587900 4753 flags.go:64] FLAG: --read-only-port="10255" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587909 4753 flags.go:64] FLAG: --register-node="true" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587918 4753 flags.go:64] FLAG: --register-schedulable="true" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587927 4753 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587942 4753 flags.go:64] FLAG: --registry-burst="10" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587952 4753 flags.go:64] FLAG: --registry-qps="5" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587963 4753 flags.go:64] FLAG: --reserved-cpus="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587972 4753 flags.go:64] FLAG: --reserved-memory="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.587984 4753 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 
20:14:51.587993 4753 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588003 4753 flags.go:64] FLAG: --rotate-certificates="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588013 4753 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588022 4753 flags.go:64] FLAG: --runonce="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588031 4753 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588040 4753 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588049 4753 flags.go:64] FLAG: --seccomp-default="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588058 4753 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588067 4753 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588076 4753 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588086 4753 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588095 4753 flags.go:64] FLAG: --storage-driver-password="root" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588105 4753 flags.go:64] FLAG: --storage-driver-secure="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588114 4753 flags.go:64] FLAG: --storage-driver-table="stats" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588123 4753 flags.go:64] FLAG: --storage-driver-user="root" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588132 4753 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588165 4753 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 05 
20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588174 4753 flags.go:64] FLAG: --system-cgroups="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588183 4753 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588197 4753 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588205 4753 flags.go:64] FLAG: --tls-cert-file="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588226 4753 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588236 4753 flags.go:64] FLAG: --tls-min-version="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588245 4753 flags.go:64] FLAG: --tls-private-key-file="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588254 4753 flags.go:64] FLAG: --topology-manager-policy="none" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588263 4753 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588271 4753 flags.go:64] FLAG: --topology-manager-scope="container" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588280 4753 flags.go:64] FLAG: --v="2" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588293 4753 flags.go:64] FLAG: --version="false" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588305 4753 flags.go:64] FLAG: --vmodule="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588316 4753 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.588327 4753 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588593 4753 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588604 4753 feature_gate.go:330] unrecognized feature gate: 
ManagedBootImagesAWS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588613 4753 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588621 4753 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588629 4753 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588637 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588645 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588653 4753 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588661 4753 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588669 4753 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588676 4753 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588685 4753 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588693 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588700 4753 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588709 4753 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588717 4753 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588725 4753 
feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588733 4753 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588741 4753 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588748 4753 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588756 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588764 4753 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588772 4753 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588780 4753 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588787 4753 feature_gate.go:330] unrecognized feature gate: Example Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588798 4753 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588808 4753 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588831 4753 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588841 4753 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588851 4753 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588861 4753 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588869 4753 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588878 4753 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588887 4753 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588895 4753 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588903 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588911 4753 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588920 4753 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588928 4753 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588936 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588944 4753 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588951 4753 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588959 4753 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588967 4753 feature_gate.go:330] unrecognized feature gate: 
MinimumKubeletVersion Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588975 4753 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588983 4753 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588991 4753 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.588998 4753 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589006 4753 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589014 4753 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589022 4753 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589030 4753 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589037 4753 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589045 4753 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589053 4753 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589060 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589068 4753 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589076 4753 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 
20:14:51.589083 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589097 4753 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589107 4753 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589116 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589124 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589133 4753 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589173 4753 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589182 4753 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589191 4753 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589201 4753 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589211 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589219 4753 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.589228 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.590322 4753 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.603482 4753 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.603542 4753 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603708 4753 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603726 4753 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603734 4753 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603743 4753 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603751 4753 feature_gate.go:330] unrecognized feature gate: Example Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603762 4753 feature_gate.go:330] 
unrecognized feature gate: AlibabaPlatform Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603770 4753 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603778 4753 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603785 4753 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603794 4753 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603802 4753 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603812 4753 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603822 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603830 4753 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603839 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603847 4753 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603855 4753 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603863 4753 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603871 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603879 4753 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 
20:14:51.603887 4753 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603895 4753 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603903 4753 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603910 4753 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603918 4753 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603926 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603934 4753 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603942 4753 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603950 4753 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603959 4753 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603971 4753 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603987 4753 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.603997 4753 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604005 4753 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604014 4753 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604024 4753 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604033 4753 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604040 4753 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604048 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604056 4753 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604065 4753 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604076 4753 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604087 4753 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604097 4753 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604111 4753 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604123 4753 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604171 4753 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604187 4753 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604201 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604213 4753 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604224 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604235 4753 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604244 4753 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604445 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604477 4753 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604488 4753 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604498 4753 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604508 4753 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604545 4753 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 
20:14:51.604555 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604563 4753 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604571 4753 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604634 4753 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604644 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604652 4753 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604660 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604668 4753 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604678 4753 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604688 4753 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604697 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604705 4753 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.604720 4753 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604969 4753 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.604989 4753 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605001 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605013 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605025 4753 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605039 4753 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605050 4753 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605060 
4753 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605072 4753 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605083 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605093 4753 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605103 4753 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605113 4753 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605123 4753 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605167 4753 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605178 4753 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605186 4753 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605195 4753 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605205 4753 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605273 4753 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605287 4753 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605299 4753 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605308 4753 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605317 4753 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605326 4753 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605335 4753 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605343 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605353 4753 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605362 4753 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605372 4753 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605382 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605395 4753 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605407 4753 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605418 4753 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605428 4753 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605438 4753 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605451 4753 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605463 4753 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605474 4753 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605483 4753 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605491 4753 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605503 4753 feature_gate.go:330] unrecognized feature gate: Example Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605511 4753 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605519 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605528 4753 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605536 4753 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 05 20:14:51 crc 
kubenswrapper[4753]: W1005 20:14:51.605544 4753 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605593 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605605 4753 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605614 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605623 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605632 4753 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605640 4753 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605648 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605659 4753 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605669 4753 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605680 4753 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605691 4753 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605701 4753 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605711 4753 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 
20:14:51.605719 4753 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605726 4753 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605735 4753 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605762 4753 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605771 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605781 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605790 4753 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605800 4753 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605811 4753 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605820 4753 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.605830 4753 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.605844 4753 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false 
ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.606124 4753 server.go:940] "Client rotation is on, will bootstrap in background" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.617474 4753 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.617643 4753 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.621863 4753 server.go:997] "Starting client certificate rotation" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.621917 4753 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.623639 4753 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-21 07:12:09.765212113 +0000 UTC Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.623814 4753 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1114h57m18.141403865s for next certificate rotation Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.648194 4753 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.651152 4753 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.665264 4753 log.go:25] "Validated CRI v1 runtime API" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.704830 4753 log.go:25] "Validated CRI v1 image API" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.707628 4753 server.go:1437] "Using cgroup driver setting received from 
the CRI runtime" cgroupDriver="systemd" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.717545 4753 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-05-20-01-58-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.717597 4753 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.745622 4753 manager.go:217] Machine: {Timestamp:2025-10-05 20:14:51.741719097 +0000 UTC m=+0.590047419 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799886 MemoryCapacity:25199484928 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5031572e-89d6-40ea-86fd-ab9d0632be0c BootID:6fe6c287-7fce-4f83-8f74-f3c461744d43 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 
Capacity:1073741824 Type:vfs Inodes:3076109 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599742464 Type:vfs Inodes:3076109 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:bb:c9:d9 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:bb:c9:d9 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:66:49:05 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:2c:3a:69 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:7a:6c:72 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:eb:16:24 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:5c:88:1b Speed:-1 Mtu:1496} {Name:eth10 MacAddress:22:1c:a7:d3:27:2a Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:6e:56:a7:b7:cb:80 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199484928 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.746054 4753 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.746341 4753 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.748597 4753 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.748864 4753 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.748899 4753 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.749231 4753 topology_manager.go:138] "Creating topology manager with none policy"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.749247 4753 container_manager_linux.go:303] "Creating device plugin manager"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.749771 4753 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.749806 4753 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.750010 4753 state_mem.go:36] "Initialized new in-memory state store"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.750101 4753 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.753995 4753 kubelet.go:418] "Attempting to sync node with API server"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.754026 4753 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.754120 4753 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.754153 4753 kubelet.go:324] "Adding apiserver pod source"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.754168 4753 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.758983 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Oct 05 20:14:51 crc kubenswrapper[4753]: E1005 20:14:51.759393 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.759073 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Oct 05 20:14:51 crc kubenswrapper[4753]: E1005 20:14:51.759429 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.761302 4753 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.762785 4753 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.764020 4753 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.765895 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.765952 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.765962 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.765970 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.765982 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.765990 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.765999 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.766011 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.766020 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.766029 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.766040 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.766048 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.767267 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.767699 4753 server.go:1280] "Started kubelet"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.768607 4753 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.768606 4753 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 05 20:14:51 crc systemd[1]: Started Kubernetes Kubelet.
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.769233 4753 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.769459 4753 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.769664 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.769689 4753 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.769870 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 07:34:50.526956838 +0000 UTC
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.769924 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1115h19m58.757035855s for next certificate rotation
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.771989 4753 server.go:460] "Adding debug handlers to kubelet server"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.772018 4753 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.772562 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Oct 05 20:14:51 crc kubenswrapper[4753]: E1005 20:14:51.772619 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Oct 05 20:14:51 crc kubenswrapper[4753]: E1005 20:14:51.772706 4753 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.772004 4753 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.773244 4753 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.773513 4753 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.773528 4753 factory.go:55] Registering systemd factory
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.773537 4753 factory.go:221] Registration of the systemd container factory successfully
Oct 05 20:14:51 crc kubenswrapper[4753]: E1005 20:14:51.773592 4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="200ms"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.773759 4753 factory.go:153] Registering CRI-O factory
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.773775 4753 factory.go:221] Registration of the crio container factory successfully
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.773794 4753 factory.go:103] Registering Raw factory
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.773807 4753 manager.go:1196] Started watching for new ooms in manager
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.774353 4753 manager.go:319] Starting recovery of all containers
Oct 05 20:14:51 crc kubenswrapper[4753]: E1005 20:14:51.775704 4753 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186bb18b6abd50bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-05 20:14:51.76767302 +0000 UTC m=+0.616001262,LastTimestamp:2025-10-05 20:14:51.76767302 +0000 UTC m=+0.616001262,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.799751 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.799842 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.799859 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.799894 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.799910 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.799924 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.799939 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.799972 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.799992 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800008 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800021 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800054 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800084 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800103 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800189 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800205 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800218 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800241 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800295 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800309 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800323 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800358 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800374 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800389 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800403 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800436 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800452 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800465 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800527 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800544 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.800559 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.801877 4753 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.801931 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.801952 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.801985 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802001 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802015 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802030 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802074 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802108 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802123 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802225 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802257 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802309 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802343 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802449 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802480 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802496 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802508 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802523 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802536 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802570 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802585 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802603 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802619 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802655 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802668 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802681 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802695 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802727 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802742 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802755 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802769 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802800 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802813 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802827 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802840 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802853 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802890 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802910 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802938 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.802976 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803038 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803054 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803067 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803085 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803119 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803176 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803193 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803206 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803220 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803256 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803271 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803285 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803510 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803526 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script"
seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803540 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803556 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803590 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803603 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803617 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803634 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803669 4753 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803686 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803700 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803716 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803749 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803765 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803779 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803792 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803825 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803838 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803853 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803867 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803900 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803927 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803944 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803980 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.803998 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.804009 4753 manager.go:324] Recovery completed Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.804013 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805473 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805551 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805583 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805610 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805634 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805658 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805678 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805702 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805724 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805743 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805762 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805784 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805803 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805823 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805841 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805862 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805886 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805909 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805930 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" 
seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805950 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805971 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.805995 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806017 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806037 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806060 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 
20:14:51.806082 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806106 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806129 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806175 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806198 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806220 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806241 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806263 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806287 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806307 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806328 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806349 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806369 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806390 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806412 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806446 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806467 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806490 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806514 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806535 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806558 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806583 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806606 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806629 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806652 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806675 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806696 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806718 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806743 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806794 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806816 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806836 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806859 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806880 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806902 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806924 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806949 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806970 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.806991 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807012 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807035 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807054 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807077 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807097 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807119 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807173 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807196 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807218 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807238 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807260 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807281 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807303 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807324 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807348 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807369 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807391 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807414 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807434 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807455 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807476 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807498 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807517 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807537 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807558 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807577 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807601 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807624 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807643 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807662 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807682 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807701 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807721 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807743 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807764 4753 reconstruct.go:97] "Volume reconstruction finished"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.807780 4753 reconciler.go:26] "Reconciler: start to sync state"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.819818 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.823808 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.823868 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.823890 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.825470 4753 cpu_manager.go:225] "Starting CPU manager" policy="none"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.825493 4753 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.825520 4753 state_mem.go:36] "Initialized new in-memory state store"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.838631 4753 policy_none.go:49] "None policy: Start"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.839913 4753 memory_manager.go:170] "Starting memorymanager" policy="None"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.841063 4753 state_mem.go:35] "Initializing new in-memory state store"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.847098 4753 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.850746 4753 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.850794 4753 status_manager.go:217] "Starting to sync pod status with apiserver"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.850831 4753 kubelet.go:2335] "Starting kubelet main sync loop"
Oct 05 20:14:51 crc kubenswrapper[4753]: E1005 20:14:51.850898 4753 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Oct 05 20:14:51 crc kubenswrapper[4753]: W1005 20:14:51.853832 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Oct 05 20:14:51 crc kubenswrapper[4753]: E1005 20:14:51.853958 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Oct 05 20:14:51 crc kubenswrapper[4753]: E1005 20:14:51.873072 4753 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.906185 4753 manager.go:334] "Starting Device Plugin manager"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.906245 4753 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.906260 4753 server.go:79] "Starting device plugin registration server"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.906679 4753 eviction_manager.go:189] "Eviction manager: starting control loop"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.906699 4753 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.907095 4753 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.907214 4753 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.907228 4753 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Oct 05 20:14:51 crc kubenswrapper[4753]: E1005 20:14:51.916564 4753 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.950980 4753 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"]
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.951052 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.952462 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.952544 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.952562 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.953645 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.953690 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.954275 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.954629 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.954671 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.954688 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.956898 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.958102 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.958132 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.958352 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.958467 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.958524 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.959272 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.959301 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.959313 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.959465 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.959643 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.959694 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.960954 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.960986 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.961001 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.961046 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.961063 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.961073 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.961376 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.961416 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.961432 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.961624 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.961754 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.961801 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.962748 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.962881 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.962994 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.963179 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.963443 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.963295 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.963711 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.963731 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.964338 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.964369 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:14:51 crc kubenswrapper[4753]: I1005 20:14:51.964381 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:14:51 crc kubenswrapper[4753]: E1005 20:14:51.974992 4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="400ms"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.006996 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.008083 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.008113 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.008122 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.008160 4753 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: E1005 20:14:52.008587 4753 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.012810 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.012838 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.012857 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.012872 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.012888 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.012902 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.012915 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.012929 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.012943 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.012957 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.012971 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.012985 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.012999 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.013016 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.013033 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114384 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114432 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114453 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114469 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114484 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114499 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114498 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114513 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114633 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114665 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114550 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114553 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114563 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114541 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114643 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114542 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114726 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114745 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114745 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114760 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114772 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114780 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114794 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114798 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114816 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114817 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114831 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114841 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114862 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.114957 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.208749 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.209791 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.209833 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.209843 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.209868 4753 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 05 20:14:52 crc kubenswrapper[4753]: E1005 20:14:52.210275 4753 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.290424 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.303587 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.322036 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.328775 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.332627 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 05 20:14:52 crc kubenswrapper[4753]: W1005 20:14:52.339765 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-bd6fbedd08d7d9def8581238ea3df9cba76f82b131a22ace15ea77ec5d053352 WatchSource:0}: Error finding container bd6fbedd08d7d9def8581238ea3df9cba76f82b131a22ace15ea77ec5d053352: Status 404 returned error can't find the container with id bd6fbedd08d7d9def8581238ea3df9cba76f82b131a22ace15ea77ec5d053352 Oct 05 20:14:52 crc kubenswrapper[4753]: W1005 20:14:52.340371 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-c01a0d2316ba68c91601859e2cdbcdf477c294b41c109c9f1e7f4561604cfd14 WatchSource:0}: Error finding container c01a0d2316ba68c91601859e2cdbcdf477c294b41c109c9f1e7f4561604cfd14: Status 404 returned error can't find the container with id c01a0d2316ba68c91601859e2cdbcdf477c294b41c109c9f1e7f4561604cfd14 Oct 05 20:14:52 crc kubenswrapper[4753]: W1005 20:14:52.352100 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-bc2c8bd30434a182580b52f62fec3064c9068d7e7f41cbac09864b869cb9af20 WatchSource:0}: Error finding container bc2c8bd30434a182580b52f62fec3064c9068d7e7f41cbac09864b869cb9af20: Status 404 returned error can't 
find the container with id bc2c8bd30434a182580b52f62fec3064c9068d7e7f41cbac09864b869cb9af20 Oct 05 20:14:52 crc kubenswrapper[4753]: E1005 20:14:52.352615 4753 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186bb18b6abd50bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-05 20:14:51.76767302 +0000 UTC m=+0.616001262,LastTimestamp:2025-10-05 20:14:51.76767302 +0000 UTC m=+0.616001262,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 05 20:14:52 crc kubenswrapper[4753]: E1005 20:14:52.376046 4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="800ms" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.610869 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.612549 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.612592 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.612604 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:52 crc 
kubenswrapper[4753]: I1005 20:14:52.612626 4753 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 05 20:14:52 crc kubenswrapper[4753]: E1005 20:14:52.613085 4753 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Oct 05 20:14:52 crc kubenswrapper[4753]: W1005 20:14:52.682622 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 05 20:14:52 crc kubenswrapper[4753]: E1005 20:14:52.682702 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.770249 4753 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 05 20:14:52 crc kubenswrapper[4753]: W1005 20:14:52.794899 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 05 20:14:52 crc kubenswrapper[4753]: E1005 20:14:52.794995 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.859169 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c01a0d2316ba68c91601859e2cdbcdf477c294b41c109c9f1e7f4561604cfd14"} Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.860323 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bc2c8bd30434a182580b52f62fec3064c9068d7e7f41cbac09864b869cb9af20"} Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.863027 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5dc10bd4d1ae726af728987deff534fa17ad326cbc2392ca9cf5652c4d803c8d"} Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.866282 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f8fcc1563efba1bd6d9fe89cf8c71f0876360092fa2cd87fa4ec9340375e5e28"} Oct 05 20:14:52 crc kubenswrapper[4753]: I1005 20:14:52.867198 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bd6fbedd08d7d9def8581238ea3df9cba76f82b131a22ace15ea77ec5d053352"} Oct 05 20:14:52 crc kubenswrapper[4753]: W1005 20:14:52.990159 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 05 20:14:52 crc kubenswrapper[4753]: E1005 20:14:52.990245 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 05 20:14:53 crc kubenswrapper[4753]: W1005 20:14:53.032265 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 05 20:14:53 crc kubenswrapper[4753]: E1005 20:14:53.032349 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 05 20:14:53 crc kubenswrapper[4753]: E1005 20:14:53.177281 4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="1.6s" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.413788 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.414885 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:53 crc 
kubenswrapper[4753]: I1005 20:14:53.414931 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.414942 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.414988 4753 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 05 20:14:53 crc kubenswrapper[4753]: E1005 20:14:53.415542 4753 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.770611 4753 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.871256 4753 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543" exitCode=0 Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.871322 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543"} Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.871447 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.872387 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.872416 4753 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.872426 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.874105 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.874651 4753 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77" exitCode=0 Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.874705 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.874720 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77"} Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.875050 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.875079 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.875091 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.875380 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.875399 4753 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.875407 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.877438 4753 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0e86b41fe3af548448915a06369be226f2973795d2ac7b4da89e5611d9fdc548" exitCode=0 Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.877460 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0e86b41fe3af548448915a06369be226f2973795d2ac7b4da89e5611d9fdc548"} Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.877568 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.879213 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.879239 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.879251 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.879434 4753 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583" exitCode=0 Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.879482 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583"} Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.879557 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.880418 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.880442 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.880454 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.884981 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7"} Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.885329 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2"} Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.885345 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e"} Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.885356 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3"} Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.885440 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.886578 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.886602 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:53 crc kubenswrapper[4753]: I1005 20:14:53.886613 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:54 crc kubenswrapper[4753]: W1005 20:14:54.398282 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 05 20:14:54 crc kubenswrapper[4753]: E1005 20:14:54.398360 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.769942 4753 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 05 20:14:54 crc kubenswrapper[4753]: E1005 20:14:54.778610 4753 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="3.2s" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.888917 4753 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fefc9957b862552c9cbaca63673c87e9d088e99264455f9e3db8b572dbe9be0d" exitCode=0 Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.889066 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.889066 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fefc9957b862552c9cbaca63673c87e9d088e99264455f9e3db8b572dbe9be0d"} Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.890253 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.890294 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.890309 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.891809 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3"} Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.891841 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cf427d5366f0edf753e57bb958b0e30b93716e0d46779a5c6fcca2db6794c868"} Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.891851 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43"} Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.891907 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.893120 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.893153 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.893162 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.896046 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fb7661fffa9cd9d3a0b66c94345bfee1dd4a014f64f7f62b6a3bf9f4cbf21262"} Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.896105 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315"} Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.896120 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc"} Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.896130 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb"} Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.896159 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc"} Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.896563 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.897480 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.897622 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.897729 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.897960 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"359ee28cd996a4068246945cb7d6b0cb2921ea96607659b72f61679d8e819242"} Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.897978 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:54 crc kubenswrapper[4753]: 
I1005 20:14:54.898035 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.899011 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.899038 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.899048 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.899337 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.899382 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:54 crc kubenswrapper[4753]: I1005 20:14:54.899400 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:54 crc kubenswrapper[4753]: W1005 20:14:54.911848 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 05 20:14:54 crc kubenswrapper[4753]: E1005 20:14:54.911943 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.016302 4753 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Oct 05 20:14:55 crc kubenswrapper[4753]: W1005 20:14:55.016814 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Oct 05 20:14:55 crc kubenswrapper[4753]: E1005 20:14:55.018759 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.019632 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.019670 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.019682 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.019747 4753 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 05 20:14:55 crc kubenswrapper[4753]: E1005 20:14:55.020365 4753 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.584249 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.906099 4753 generic.go:334] "Generic (PLEG): 
container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="60975f522caf98398fb75fe07cf9f73203a2000860de4ca7bba29b376817270a" exitCode=0 Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.906186 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"60975f522caf98398fb75fe07cf9f73203a2000860de4ca7bba29b376817270a"} Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.906203 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.906260 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.906317 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.907040 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.907104 4753 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.907164 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.907383 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.907546 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.907564 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.907571 
4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.907635 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.907663 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.907680 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.908242 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.908293 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.908310 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.908333 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.908368 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.908378 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.908702 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.908721 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 05 20:14:55 crc kubenswrapper[4753]: I1005 20:14:55.908731 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:56 crc kubenswrapper[4753]: I1005 20:14:56.912257 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0198c51cab04bd8c2b4c225c54462dfaf7a92a00aa357463cef492efa09a69bb"} Oct 05 20:14:56 crc kubenswrapper[4753]: I1005 20:14:56.912310 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5b000406bb8af6650fe0b747727019be9448e6c19b6469e01a2355933688e105"} Oct 05 20:14:56 crc kubenswrapper[4753]: I1005 20:14:56.912327 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3b934eb95fb0ebd4eef3cbd8edadc0df5cd8c2e6f2c2559ed3441081869c0bba"} Oct 05 20:14:56 crc kubenswrapper[4753]: I1005 20:14:56.912339 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b23c13ec0f79486f1f239164b8ef9cfccf612b25349546cdcb47dcc797acae1a"} Oct 05 20:14:56 crc kubenswrapper[4753]: I1005 20:14:56.912350 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"49a91a72e74f6ab35cb62c18fad465ec6fe891f9802978677f3ef0f1d1b4ddd8"} Oct 05 20:14:56 crc kubenswrapper[4753]: I1005 20:14:56.912379 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:56 crc kubenswrapper[4753]: I1005 20:14:56.913185 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 05 20:14:56 crc kubenswrapper[4753]: I1005 20:14:56.913249 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:56 crc kubenswrapper[4753]: I1005 20:14:56.913269 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:56 crc kubenswrapper[4753]: I1005 20:14:56.913276 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:56 crc kubenswrapper[4753]: I1005 20:14:56.913831 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:56 crc kubenswrapper[4753]: I1005 20:14:56.913849 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:56 crc kubenswrapper[4753]: I1005 20:14:56.913857 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:57 crc kubenswrapper[4753]: I1005 20:14:57.286300 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 05 20:14:57 crc kubenswrapper[4753]: I1005 20:14:57.662726 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 05 20:14:57 crc kubenswrapper[4753]: I1005 20:14:57.914133 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:57 crc kubenswrapper[4753]: I1005 20:14:57.914932 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:57 crc kubenswrapper[4753]: I1005 20:14:57.915307 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:57 crc kubenswrapper[4753]: I1005 20:14:57.915350 4753 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:57 crc kubenswrapper[4753]: I1005 20:14:57.915363 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:57 crc kubenswrapper[4753]: I1005 20:14:57.916910 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:57 crc kubenswrapper[4753]: I1005 20:14:57.916948 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:57 crc kubenswrapper[4753]: I1005 20:14:57.916964 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:58 crc kubenswrapper[4753]: I1005 20:14:58.221169 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:58 crc kubenswrapper[4753]: I1005 20:14:58.222349 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:58 crc kubenswrapper[4753]: I1005 20:14:58.222393 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:58 crc kubenswrapper[4753]: I1005 20:14:58.222406 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:58 crc kubenswrapper[4753]: I1005 20:14:58.222435 4753 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 05 20:14:58 crc kubenswrapper[4753]: I1005 20:14:58.360081 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 05 20:14:58 crc kubenswrapper[4753]: I1005 20:14:58.828369 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 05 20:14:58 crc kubenswrapper[4753]: I1005 
20:14:58.916698 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:58 crc kubenswrapper[4753]: I1005 20:14:58.916699 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:58 crc kubenswrapper[4753]: I1005 20:14:58.917770 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:58 crc kubenswrapper[4753]: I1005 20:14:58.917834 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:58 crc kubenswrapper[4753]: I1005 20:14:58.917854 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:58 crc kubenswrapper[4753]: I1005 20:14:58.917862 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:58 crc kubenswrapper[4753]: I1005 20:14:58.917881 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:58 crc kubenswrapper[4753]: I1005 20:14:58.917889 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:59 crc kubenswrapper[4753]: I1005 20:14:59.920068 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:59 crc kubenswrapper[4753]: I1005 20:14:59.920992 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:59 crc kubenswrapper[4753]: I1005 20:14:59.921036 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:59 crc kubenswrapper[4753]: I1005 20:14:59.921060 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 
05 20:14:59 crc kubenswrapper[4753]: I1005 20:14:59.960994 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 05 20:14:59 crc kubenswrapper[4753]: I1005 20:14:59.961282 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:14:59 crc kubenswrapper[4753]: I1005 20:14:59.962794 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:14:59 crc kubenswrapper[4753]: I1005 20:14:59.962832 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:14:59 crc kubenswrapper[4753]: I1005 20:14:59.962847 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:14:59 crc kubenswrapper[4753]: I1005 20:14:59.971484 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 05 20:15:00 crc kubenswrapper[4753]: I1005 20:15:00.184271 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 05 20:15:00 crc kubenswrapper[4753]: I1005 20:15:00.922499 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:15:00 crc kubenswrapper[4753]: I1005 20:15:00.923636 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:00 crc kubenswrapper[4753]: I1005 20:15:00.923781 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:00 crc kubenswrapper[4753]: I1005 20:15:00.923815 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:00 crc 
kubenswrapper[4753]: I1005 20:15:00.954622 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 05 20:15:01 crc kubenswrapper[4753]: E1005 20:15:01.917737 4753 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 05 20:15:01 crc kubenswrapper[4753]: I1005 20:15:01.924817 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:15:01 crc kubenswrapper[4753]: I1005 20:15:01.925655 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:01 crc kubenswrapper[4753]: I1005 20:15:01.925680 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:01 crc kubenswrapper[4753]: I1005 20:15:01.925717 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:02 crc kubenswrapper[4753]: I1005 20:15:02.212118 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 05 20:15:02 crc kubenswrapper[4753]: I1005 20:15:02.212376 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:15:02 crc kubenswrapper[4753]: I1005 20:15:02.213766 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:02 crc kubenswrapper[4753]: I1005 20:15:02.213823 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:02 crc kubenswrapper[4753]: I1005 20:15:02.213835 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:02 crc kubenswrapper[4753]: I1005 20:15:02.927459 4753 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:15:02 crc kubenswrapper[4753]: I1005 20:15:02.928288 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:02 crc kubenswrapper[4753]: I1005 20:15:02.928321 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:02 crc kubenswrapper[4753]: I1005 20:15:02.928330 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:02 crc kubenswrapper[4753]: I1005 20:15:02.932340 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 05 20:15:03 crc kubenswrapper[4753]: I1005 20:15:03.184860 4753 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 05 20:15:03 crc kubenswrapper[4753]: I1005 20:15:03.184997 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 05 20:15:03 crc kubenswrapper[4753]: I1005 20:15:03.930262 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:15:03 crc kubenswrapper[4753]: I1005 20:15:03.931075 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:03 crc 
kubenswrapper[4753]: I1005 20:15:03.931111 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:03 crc kubenswrapper[4753]: I1005 20:15:03.931122 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:05 crc kubenswrapper[4753]: W1005 20:15:05.510043 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 05 20:15:05 crc kubenswrapper[4753]: I1005 20:15:05.510194 4753 trace.go:236] Trace[466197054]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Oct-2025 20:14:55.508) (total time: 10001ms): Oct 05 20:15:05 crc kubenswrapper[4753]: Trace[466197054]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:15:05.510) Oct 05 20:15:05 crc kubenswrapper[4753]: Trace[466197054]: [10.001272877s] [10.001272877s] END Oct 05 20:15:05 crc kubenswrapper[4753]: E1005 20:15:05.510231 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 05 20:15:05 crc kubenswrapper[4753]: I1005 20:15:05.727328 4753 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41980->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 05 
20:15:05 crc kubenswrapper[4753]: I1005 20:15:05.727641 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41980->192.168.126.11:17697: read: connection reset by peer" Oct 05 20:15:05 crc kubenswrapper[4753]: I1005 20:15:05.771347 4753 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 05 20:15:05 crc kubenswrapper[4753]: I1005 20:15:05.938175 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 05 20:15:05 crc kubenswrapper[4753]: I1005 20:15:05.940521 4753 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fb7661fffa9cd9d3a0b66c94345bfee1dd4a014f64f7f62b6a3bf9f4cbf21262" exitCode=255 Oct 05 20:15:05 crc kubenswrapper[4753]: I1005 20:15:05.940598 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fb7661fffa9cd9d3a0b66c94345bfee1dd4a014f64f7f62b6a3bf9f4cbf21262"} Oct 05 20:15:05 crc kubenswrapper[4753]: I1005 20:15:05.940809 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:15:05 crc kubenswrapper[4753]: I1005 20:15:05.941680 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:05 crc kubenswrapper[4753]: I1005 20:15:05.941716 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 05 20:15:05 crc kubenswrapper[4753]: I1005 20:15:05.941750 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:05 crc kubenswrapper[4753]: I1005 20:15:05.942348 4753 scope.go:117] "RemoveContainer" containerID="fb7661fffa9cd9d3a0b66c94345bfee1dd4a014f64f7f62b6a3bf9f4cbf21262" Oct 05 20:15:06 crc kubenswrapper[4753]: I1005 20:15:06.383392 4753 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Oct 05 20:15:06 crc kubenswrapper[4753]: I1005 20:15:06.383477 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 05 20:15:06 crc kubenswrapper[4753]: I1005 20:15:06.395767 4753 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]log ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]etcd ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 05 20:15:06 crc kubenswrapper[4753]: 
[+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/generic-apiserver-start-informers ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/priority-and-fairness-filter ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/start-apiextensions-informers ok Oct 05 20:15:06 crc kubenswrapper[4753]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Oct 05 20:15:06 crc kubenswrapper[4753]: [-]poststarthook/crd-informer-synced failed: reason withheld Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/start-system-namespaces-controller ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 05 20:15:06 crc kubenswrapper[4753]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Oct 05 20:15:06 crc kubenswrapper[4753]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Oct 05 20:15:06 crc kubenswrapper[4753]: [-]poststarthook/priority-and-fairness-config-producer failed: 
reason withheld Oct 05 20:15:06 crc kubenswrapper[4753]: [-]poststarthook/bootstrap-controller failed: reason withheld Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/start-kube-aggregator-informers ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 05 20:15:06 crc kubenswrapper[4753]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 05 20:15:06 crc kubenswrapper[4753]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]autoregister-completion ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/apiservice-openapi-controller ok Oct 05 20:15:06 crc kubenswrapper[4753]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 05 20:15:06 crc kubenswrapper[4753]: livez check failed Oct 05 20:15:06 crc kubenswrapper[4753]: I1005 20:15:06.395855 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:15:06 crc kubenswrapper[4753]: I1005 20:15:06.944176 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 05 20:15:06 crc kubenswrapper[4753]: I1005 20:15:06.946204 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b"} Oct 05 20:15:06 crc kubenswrapper[4753]: I1005 20:15:06.946340 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:15:06 crc kubenswrapper[4753]: I1005 20:15:06.947131 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:06 crc kubenswrapper[4753]: I1005 20:15:06.947165 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:06 crc kubenswrapper[4753]: I1005 20:15:06.947173 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:07 crc kubenswrapper[4753]: I1005 20:15:07.667637 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 05 20:15:07 crc kubenswrapper[4753]: I1005 20:15:07.949086 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:15:07 crc kubenswrapper[4753]: I1005 20:15:07.949418 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 05 20:15:07 crc kubenswrapper[4753]: I1005 20:15:07.950581 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:07 crc kubenswrapper[4753]: I1005 20:15:07.950610 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:07 crc kubenswrapper[4753]: I1005 20:15:07.950619 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:07 crc kubenswrapper[4753]: I1005 20:15:07.952915 4753 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 05 20:15:08 crc kubenswrapper[4753]: I1005 20:15:08.392704 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 05 20:15:08 crc kubenswrapper[4753]: I1005 20:15:08.393000 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:15:08 crc kubenswrapper[4753]: I1005 20:15:08.394623 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:08 crc kubenswrapper[4753]: I1005 20:15:08.394665 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:08 crc kubenswrapper[4753]: I1005 20:15:08.394679 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:08 crc kubenswrapper[4753]: I1005 20:15:08.410664 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 05 20:15:08 crc kubenswrapper[4753]: I1005 20:15:08.952757 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:15:08 crc kubenswrapper[4753]: I1005 20:15:08.952948 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:15:08 crc kubenswrapper[4753]: I1005 20:15:08.956171 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:08 crc kubenswrapper[4753]: I1005 20:15:08.956236 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:08 crc kubenswrapper[4753]: I1005 20:15:08.956257 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:08 crc kubenswrapper[4753]: I1005 
20:15:08.959026 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:08 crc kubenswrapper[4753]: I1005 20:15:08.959085 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:08 crc kubenswrapper[4753]: I1005 20:15:08.959108 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:09 crc kubenswrapper[4753]: I1005 20:15:09.955089 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:15:09 crc kubenswrapper[4753]: I1005 20:15:09.956046 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:09 crc kubenswrapper[4753]: I1005 20:15:09.956076 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:09 crc kubenswrapper[4753]: I1005 20:15:09.956089 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:10 crc kubenswrapper[4753]: I1005 20:15:10.082392 4753 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.395037 4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.398083 4753 trace.go:236] Trace[771587492]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Oct-2025 20:14:58.090) (total time: 13307ms): Oct 05 20:15:11 crc kubenswrapper[4753]: Trace[771587492]: ---"Objects listed" error: 13307ms (20:15:11.397) Oct 05 20:15:11 crc kubenswrapper[4753]: 
Trace[771587492]: [13.307886118s] [13.307886118s] END Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.398167 4753 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.398182 4753 trace.go:236] Trace[967458458]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Oct-2025 20:15:00.146) (total time: 11251ms): Oct 05 20:15:11 crc kubenswrapper[4753]: Trace[967458458]: ---"Objects listed" error: 11251ms (20:15:11.398) Oct 05 20:15:11 crc kubenswrapper[4753]: Trace[967458458]: [11.251822627s] [11.251822627s] END Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.398197 4753 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.398307 4753 trace.go:236] Trace[195533645]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Oct-2025 20:14:58.251) (total time: 13146ms): Oct 05 20:15:11 crc kubenswrapper[4753]: Trace[195533645]: ---"Objects listed" error: 13146ms (20:15:11.398) Oct 05 20:15:11 crc kubenswrapper[4753]: Trace[195533645]: [13.146298875s] [13.146298875s] END Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.398339 4753 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.399336 4753 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.400544 4753 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.764710 4753 apiserver.go:52] "Watching apiserver" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.766869 4753 reflector.go:368] Caches 
populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.767088 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.767395 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.767772 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.767788 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.767812 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.767826 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.767874 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.767986 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.767535 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.768301 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.769182 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.769490 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.769690 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.770224 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.770266 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.770798 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.770824 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.772004 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.772234 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.772932 4753 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 
20:15:11.796036 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804168 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804253 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804280 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804307 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804329 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804350 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804370 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804392 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804418 4753 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804450 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804481 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804505 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804528 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804552 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod 
\"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804573 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804594 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804616 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804641 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804663 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804688 4753 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804713 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804734 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804755 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804777 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804801 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804826 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804847 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804871 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804892 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804914 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804943 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804966 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.804989 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805011 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805042 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805065 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805086 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805115 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805170 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805207 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805229 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805252 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805296 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805323 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805347 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805369 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805391 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805414 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805434 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805455 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805477 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805500 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805521 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805543 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805569 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805596 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805622 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805647 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805668 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805689 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805721 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805751 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805784 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805815 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805846 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805879 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805902 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805925 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805950 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805971 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.805996 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806019 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806049 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806079 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806106 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806130 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806183 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806206 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 05 20:15:11 
crc kubenswrapper[4753]: I1005 20:15:11.806227 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806254 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806281 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806303 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806326 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806349 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806371 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806392 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806413 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806438 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806459 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806482 4753 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806505 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806527 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806551 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806574 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806596 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806657 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806684 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806709 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806731 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806752 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806776 4753 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806801 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806825 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806850 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806873 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806894 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806916 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806937 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806963 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806987 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807009 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807033 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807058 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807080 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807102 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807127 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807166 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" 
(UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807190 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807213 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807236 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807260 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807283 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807308 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" 
(UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807332 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807365 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807388 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807418 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807445 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 05 
20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807469 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807494 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807527 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807550 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807573 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807597 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807620 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807643 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807667 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807690 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807717 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 
20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807740 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.810232 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.810361 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.810456 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.810550 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.810646 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.810789 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.810873 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.810969 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.811119 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.811251 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.811355 4753 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.811447 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.811593 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.811688 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.811780 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.811859 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" 
(UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.811953 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.812049 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.812134 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.812254 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.812352 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.812442 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.812550 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.812639 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.812730 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.812821 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.812921 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.813014 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.813098 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.813204 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.813306 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.813398 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.813524 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.813625 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.813717 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.813803 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.813892 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.813979 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " 
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.814059 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.814280 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.814386 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.814478 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.814571 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.814662 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.814761 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.814858 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.814958 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.815054 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.815168 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 05 20:15:11 crc 
kubenswrapper[4753]: I1005 20:15:11.815268 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.815370 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.815470 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.815559 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.815653 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.815752 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.815847 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.815994 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.816106 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.816236 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.815534 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.820199 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.820838 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.820887 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.820933 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.820963 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.821023 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.821045 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.821070 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.821102 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.821127 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.821161 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.821207 4753 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806152 4753 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806168 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806312 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.824050 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.824072 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806331 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806454 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806477 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806614 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806637 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806639 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806748 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806770 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806918 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.806945 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807072 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807089 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807231 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807267 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807555 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.810131 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.810990 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.811047 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.811298 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.811336 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.811364 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.812091 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.812123 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.812546 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.812824 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.814677 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.807791 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.815523 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.815745 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.815979 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.816561 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.824455 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.816699 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.816703 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.816863 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.818168 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.818272 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.818650 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.818822 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.818853 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.818983 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.819231 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.819319 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.819554 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.819691 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.824452 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.819837 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.820155 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.820133 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.820179 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.820330 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.820338 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.820353 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.820369 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.820586 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.821045 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.821421 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.820222 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.824129 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.824545 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.824630 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.824765 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.824860 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.825577 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.825622 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.825892 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.825897 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.826014 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:15:12.325993362 +0000 UTC m=+21.174321664 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.826459 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.826486 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.826923 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.827109 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.827352 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.827578 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.827636 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.827669 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.828414 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.828280 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.828032 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.828471 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.828475 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.827555 4753 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.828541 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.829038 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.829072 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.830571 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.829082 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.829337 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.829809 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.829883 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.830197 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.830369 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.830407 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.830727 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.830798 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.831299 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.831406 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:12.331391268 +0000 UTC m=+21.179719580 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.831516 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.831629 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.831746 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.832423 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.832438 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.832539 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.832780 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.832800 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.832883 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.832971 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.833051 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.833158 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.833133 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.833318 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.833748 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.833971 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.834420 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.834657 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.834913 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.835164 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.835210 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.835346 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.835603 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.835639 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.835749 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.835659 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.835940 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.836048 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.836075 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.836099 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.836410 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.836524 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.836659 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.836879 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.836640 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.837171 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.837299 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.837628 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.838028 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.838176 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.838233 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.838250 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.838294 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.838462 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.838539 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.839403 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.839594 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.839914 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.839948 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.840320 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.840524 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.840871 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.841003 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.841265 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.842311 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.842436 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.842642 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.842661 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.842673 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.842726 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:12.342709287 +0000 UTC m=+21.191037629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.843653 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.845620 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.845701 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.845857 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.846034 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.846307 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.846672 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.846730 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.846880 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.846953 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.847078 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.847293 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.847510 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.847529 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.847608 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.847821 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.847858 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.848030 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.848076 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.848369 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.849148 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.849308 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.849323 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.849845 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.849909 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.850170 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.850306 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.850534 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.850677 4753 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.850748 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:12.350729366 +0000 UTC m=+21.199057618 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.851447 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.851597 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.851808 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.853006 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.853344 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.853520 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.853677 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.853704 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.853729 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.854230 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.855212 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.860070 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.860323 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.863380 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.863427 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.863448 4753 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 05 20:15:11 crc kubenswrapper[4753]: E1005 20:15:11.863530 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:12.3635022 +0000 UTC m=+21.211830523 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.863644 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.865567 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.866956 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.867994 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.870125 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.870612 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.873319 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.875373 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.876611 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.878162 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.878659 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.879749 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.880459 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.881421 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.882703 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.883401 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.883376 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.884495 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.885202 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.886232 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.886851 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" 
path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.887916 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.888464 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.889017 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.889945 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.890556 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.891424 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.891691 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.892220 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.892914 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.894172 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 05 20:15:11 crc 
kubenswrapper[4753]: I1005 20:15:11.894790 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.896026 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.896802 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.898156 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.898743 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.899607 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.900931 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.901482 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 05 20:15:11 crc 
kubenswrapper[4753]: I1005 20:15:11.902516 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.902610 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.903225 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.904256 4753 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.904392 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.906453 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.907380 4753 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.907774 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.909425 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.911358 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.913380 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.913565 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.917276 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" 
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.918590 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.919584 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.920204 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.921264 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.921898 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.922062 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.922157 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.922841 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.922902 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.922918 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.922929 4753 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.922062 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.922991 4753 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.922398 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923116 4753 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923190 4753 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923239 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923252 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923264 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923275 4753 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923306 4753 reconciler_common.go:293] "Volume 
detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923319 4753 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923331 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923343 4753 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923349 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923357 4753 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923425 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923492 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923509 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923581 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923596 4753 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923636 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923664 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923676 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923831 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923852 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923886 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923901 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923912 4753 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923923 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923935 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923971 4753 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 
05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923983 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.923994 4753 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924007 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924051 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924066 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924077 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924088 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924006 4753 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924133 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924188 4753 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924200 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924212 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924263 4753 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924275 4753 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924290 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924323 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924335 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924347 4753 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924358 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924370 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924403 4753 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924415 4753 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924427 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924446 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924458 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924492 4753 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924504 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924515 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924526 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node 
\"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924536 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924686 4753 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924701 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924711 4753 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924723 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924735 4753 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924746 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924759 4753 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924771 4753 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924783 4753 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924795 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924807 4753 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924818 4753 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924830 4753 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924842 4753 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924853 4753 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924864 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924876 4753 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924887 4753 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924898 4753 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924908 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924918 4753 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924930 4753 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924941 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924953 4753 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924964 4753 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924975 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924985 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.924997 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925008 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925020 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925031 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925042 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925053 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925065 4753 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925076 4753 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" 
DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925092 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925102 4753 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925131 4753 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925159 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925170 4753 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925181 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925230 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925244 4753 reconciler_common.go:293] "Volume 
detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925254 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925265 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925318 4753 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925332 4753 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925344 4753 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925355 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925366 4753 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" 
DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925399 4753 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925412 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925422 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925433 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925446 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925479 4753 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925492 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc 
kubenswrapper[4753]: I1005 20:15:11.925503 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925515 4753 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925526 4753 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925558 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925572 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925606 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925660 4753 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925672 4753 reconciler_common.go:293] 
"Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925684 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925695 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925729 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925740 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925751 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925763 4753 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.925501 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926035 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926051 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926063 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926074 4753 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926106 4753 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926119 4753 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926130 4753 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" 
Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926176 4753 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926188 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926199 4753 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926210 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926221 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926253 4753 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926264 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926275 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926285 4753 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926296 4753 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926327 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926339 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926351 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926361 4753 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926372 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 
20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926383 4753 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926416 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926426 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926437 4753 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926450 4753 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926798 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926812 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926823 4753 reconciler_common.go:293] 
"Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926861 4753 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926873 4753 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926884 4753 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.927010 4753 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.927024 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.927085 4753 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.927098 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" 
(UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.927109 4753 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926676 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.928208 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.928818 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929189 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929215 4753 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929224 4753 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 05 
20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929232 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929241 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929249 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929257 4753 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929265 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929289 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929299 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929307 4753 reconciler_common.go:293] "Volume 
detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929315 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929323 4753 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929331 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929338 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929347 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929355 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929362 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929379 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929386 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929394 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.929402 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.926734 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.931986 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.932676 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.933661 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.935025 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.935748 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.941206 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.952228 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:11 crc kubenswrapper[4753]: I1005 20:15:11.960948 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.080357 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.089481 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.095514 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 05 20:15:12 crc kubenswrapper[4753]: W1005 20:15:12.103994 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-7ce4a637ca5f60c27ffab79384ccd50e4d4c29c12ede27728e45142a393523c5 WatchSource:0}: Error finding container 7ce4a637ca5f60c27ffab79384ccd50e4d4c29c12ede27728e45142a393523c5: Status 404 returned error can't find the container with id 7ce4a637ca5f60c27ffab79384ccd50e4d4c29c12ede27728e45142a393523c5 Oct 05 20:15:12 crc kubenswrapper[4753]: W1005 20:15:12.108823 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-ab49c9abcd9af32ae8bce6a375f99f8fbafc479ba81d4885b8619a3af3f6a276 WatchSource:0}: Error finding container ab49c9abcd9af32ae8bce6a375f99f8fbafc479ba81d4885b8619a3af3f6a276: Status 404 returned error can't find the container with id ab49c9abcd9af32ae8bce6a375f99f8fbafc479ba81d4885b8619a3af3f6a276 Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.238555 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.257931 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.258662 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.262654 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.280301 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.293934 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.310930 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.325124 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.335451 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.335541 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:12 crc kubenswrapper[4753]: E1005 20:15:12.335625 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:15:13.335596078 +0000 UTC m=+22.183924320 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:15:12 crc kubenswrapper[4753]: E1005 20:15:12.335654 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:15:12 crc kubenswrapper[4753]: E1005 20:15:12.335708 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:13.335694531 +0000 UTC m=+22.184022763 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.338571 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.356154 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.368713 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.380226 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.391035 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.401449 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.412537 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.422079 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.436436 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.436500 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.436522 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:12 crc kubenswrapper[4753]: E1005 20:15:12.436605 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 05 20:15:12 crc kubenswrapper[4753]: E1005 20:15:12.436634 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 05 20:15:12 crc kubenswrapper[4753]: E1005 20:15:12.436647 4753 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:12 crc kubenswrapper[4753]: E1005 20:15:12.436700 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:13.436684071 +0000 UTC m=+22.285012303 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:12 crc kubenswrapper[4753]: E1005 20:15:12.436605 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 05 20:15:12 crc kubenswrapper[4753]: E1005 20:15:12.436725 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 05 20:15:12 crc kubenswrapper[4753]: E1005 20:15:12.436733 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:12 crc kubenswrapper[4753]: E1005 20:15:12.436755 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:13.436748443 +0000 UTC m=+22.285076665 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:12 crc kubenswrapper[4753]: E1005 20:15:12.436608 4753 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 05 20:15:12 crc kubenswrapper[4753]: E1005 20:15:12.437047 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:13.437040132 +0000 UTC m=+22.285368364 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.965638 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e"} Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.965690 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a"} Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.965703 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ab49c9abcd9af32ae8bce6a375f99f8fbafc479ba81d4885b8619a3af3f6a276"} Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.971739 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7ce4a637ca5f60c27ffab79384ccd50e4d4c29c12ede27728e45142a393523c5"} Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.979172 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d"} Oct 05 
20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.979480 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"02622fbd3fb3641554d0e821920c58287718fd8198bf87b8bd2f003bac4ea016"} Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.980508 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.980994 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.982618 4753 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b" exitCode=255 Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.983114 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b"} Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.983229 4753 scope.go:117] "RemoveContainer" containerID="fb7661fffa9cd9d3a0b66c94345bfee1dd4a014f64f7f62b6a3bf9f4cbf21262" Oct 05 20:15:12 crc kubenswrapper[4753]: I1005 20:15:12.986021 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:12Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.000130 4753 scope.go:117] "RemoveContainer" containerID="b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b" Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.000424 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.002261 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.005190 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.022862 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.042663 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.061088 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.076773 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.091284 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.118826 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.137759 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.152078 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.173025 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.187547 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.209954 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb7661fffa9cd9d3a0b66c94345bfee1dd4a014f64f7f62b6a3bf9f4cbf21262\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:05Z\\\",\\\"message\\\":\\\"W1005 20:14:54.941025 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1005 20:14:54.941356 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759695294 cert, and key in /tmp/serving-cert-816832640/serving-signer.crt, 
/tmp/serving-cert-816832640/serving-signer.key\\\\nI1005 20:14:55.237753 1 observer_polling.go:159] Starting file observer\\\\nW1005 20:14:55.240629 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1005 20:14:55.240831 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:14:55.244632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-816832640/tls.crt::/tmp/serving-cert-816832640/tls.key\\\\\\\"\\\\nF1005 20:15:05.717570 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.230052 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.256622 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.344108 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.344314 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:15:15.344283835 +0000 UTC m=+24.192612067 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.344405 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.344616 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.344704 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:15.344682617 +0000 UTC m=+24.193010849 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.445324 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.445390 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.445416 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.445521 4753 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.445595 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:15.445575725 +0000 UTC m=+24.293903957 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.445608 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.445700 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.445720 4753 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.445641 4753 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.445812 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.445822 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:15.445784431 +0000 UTC m=+24.294112813 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.445832 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.445936 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:15.445912725 +0000 UTC m=+24.294241137 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.457480 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-bcrnh"] Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.457908 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bcrnh" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.462035 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.462046 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.462307 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.488771 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.521723 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.546079 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0f4dc067-d682-4823-8969-64a0184e623d-hosts-file\") pod \"node-resolver-bcrnh\" (UID: \"0f4dc067-d682-4823-8969-64a0184e623d\") " pod="openshift-dns/node-resolver-bcrnh" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.546133 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4l9h\" (UniqueName: \"kubernetes.io/projected/0f4dc067-d682-4823-8969-64a0184e623d-kube-api-access-m4l9h\") pod \"node-resolver-bcrnh\" (UID: \"0f4dc067-d682-4823-8969-64a0184e623d\") " pod="openshift-dns/node-resolver-bcrnh" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.549078 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.561372 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.579451 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb7661fffa9cd9d3a0b66c94345bfee1dd4a014f64f7f62b6a3bf9f4cbf21262\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:05Z\\\",\\\"message\\\":\\\"W1005 20:14:54.941025 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1005 20:14:54.941356 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759695294 cert, and key in /tmp/serving-cert-816832640/serving-signer.crt, /tmp/serving-cert-816832640/serving-signer.key\\\\nI1005 20:14:55.237753 1 observer_polling.go:159] Starting file observer\\\\nW1005 20:14:55.240629 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1005 20:14:55.240831 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:14:55.244632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-816832640/tls.crt::/tmp/serving-cert-816832640/tls.key\\\\\\\"\\\\nF1005 20:15:05.717570 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 
20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.592726 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.607544 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.622857 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.644312 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:13Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.646605 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0f4dc067-d682-4823-8969-64a0184e623d-hosts-file\") pod \"node-resolver-bcrnh\" (UID: \"0f4dc067-d682-4823-8969-64a0184e623d\") " pod="openshift-dns/node-resolver-bcrnh" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.646647 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4l9h\" (UniqueName: \"kubernetes.io/projected/0f4dc067-d682-4823-8969-64a0184e623d-kube-api-access-m4l9h\") pod \"node-resolver-bcrnh\" (UID: \"0f4dc067-d682-4823-8969-64a0184e623d\") " pod="openshift-dns/node-resolver-bcrnh" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.646808 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/0f4dc067-d682-4823-8969-64a0184e623d-hosts-file\") pod \"node-resolver-bcrnh\" (UID: \"0f4dc067-d682-4823-8969-64a0184e623d\") " pod="openshift-dns/node-resolver-bcrnh" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.663868 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4l9h\" (UniqueName: \"kubernetes.io/projected/0f4dc067-d682-4823-8969-64a0184e623d-kube-api-access-m4l9h\") pod \"node-resolver-bcrnh\" (UID: \"0f4dc067-d682-4823-8969-64a0184e623d\") " pod="openshift-dns/node-resolver-bcrnh" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.770738 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bcrnh" Oct 05 20:15:13 crc kubenswrapper[4753]: W1005 20:15:13.783670 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f4dc067_d682_4823_8969_64a0184e623d.slice/crio-4e060e58860232ecff0f80f9b3b06ddf94c94404d481e9600590319ceb249b4f WatchSource:0}: Error finding container 4e060e58860232ecff0f80f9b3b06ddf94c94404d481e9600590319ceb249b4f: Status 404 returned error can't find the container with id 4e060e58860232ecff0f80f9b3b06ddf94c94404d481e9600590319ceb249b4f Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.853813 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.853942 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.854015 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.854067 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.854118 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.854381 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.987228 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bcrnh" event={"ID":"0f4dc067-d682-4823-8969-64a0184e623d","Type":"ContainerStarted","Data":"4e060e58860232ecff0f80f9b3b06ddf94c94404d481e9600590319ceb249b4f"} Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.988967 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 05 20:15:13 crc kubenswrapper[4753]: I1005 20:15:13.991517 4753 scope.go:117] "RemoveContainer" containerID="b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b" Oct 05 20:15:13 crc kubenswrapper[4753]: E1005 20:15:13.991631 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.025018 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.043803 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.072100 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.089724 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.105649 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 
20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.121069 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.139192 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.158468 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.182676 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.342467 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-xlrkd"] Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.342841 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.344424 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-zr5q8"] Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.344806 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.345243 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.346235 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.346622 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.346818 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.347463 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.347619 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.348359 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.348515 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.349166 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.349449 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.368260 4753 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.384244 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454316 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-system-cni-dir\") pod \"multus-zr5q8\" (UID: 
\"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454373 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-etc-kubernetes\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454394 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a422d983-1769-4d79-9e71-b63bef552d37-proxy-tls\") pod \"machine-config-daemon-xlrkd\" (UID: \"a422d983-1769-4d79-9e71-b63bef552d37\") " pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454416 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a422d983-1769-4d79-9e71-b63bef552d37-mcd-auth-proxy-config\") pod \"machine-config-daemon-xlrkd\" (UID: \"a422d983-1769-4d79-9e71-b63bef552d37\") " pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454442 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-multus-socket-dir-parent\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454460 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/8a6cead6-0872-4b49-a08c-529805f646f2-multus-daemon-config\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454480 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pj69\" (UniqueName: \"kubernetes.io/projected/8a6cead6-0872-4b49-a08c-529805f646f2-kube-api-access-9pj69\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454500 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-multus-cni-dir\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454520 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-multus-conf-dir\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454539 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-hostroot\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454557 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-var-lib-kubelet\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454577 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-os-release\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454599 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-var-lib-cni-multus\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454617 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rd6q\" (UniqueName: \"kubernetes.io/projected/a422d983-1769-4d79-9e71-b63bef552d37-kube-api-access-6rd6q\") pod \"machine-config-daemon-xlrkd\" (UID: \"a422d983-1769-4d79-9e71-b63bef552d37\") " pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454637 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-run-k8s-cni-cncf-io\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454676 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/8a6cead6-0872-4b49-a08c-529805f646f2-cni-binary-copy\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454695 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-run-multus-certs\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454713 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-var-lib-cni-bin\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454741 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-cnibin\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454759 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-run-netns\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.454776 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/a422d983-1769-4d79-9e71-b63bef552d37-rootfs\") pod \"machine-config-daemon-xlrkd\" (UID: \"a422d983-1769-4d79-9e71-b63bef552d37\") " pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.468528 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.479620 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.489912 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.502263 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.512640 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.524996 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.535562 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.547022 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555395 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-hostroot\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555431 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-os-release\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555452 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-var-lib-cni-multus\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555470 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-var-lib-kubelet\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555492 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rd6q\" (UniqueName: \"kubernetes.io/projected/a422d983-1769-4d79-9e71-b63bef552d37-kube-api-access-6rd6q\") pod \"machine-config-daemon-xlrkd\" (UID: \"a422d983-1769-4d79-9e71-b63bef552d37\") " pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555525 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-hostroot\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555590 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-var-lib-cni-multus\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555615 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-var-lib-kubelet\") pod \"multus-zr5q8\" 
(UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555641 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8a6cead6-0872-4b49-a08c-529805f646f2-cni-binary-copy\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555661 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-run-k8s-cni-cncf-io\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555684 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-run-multus-certs\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555699 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-cnibin\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555713 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-run-netns\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555752 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-var-lib-cni-bin\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555782 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a422d983-1769-4d79-9e71-b63bef552d37-rootfs\") pod \"machine-config-daemon-xlrkd\" (UID: \"a422d983-1769-4d79-9e71-b63bef552d37\") " pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555778 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-os-release\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555796 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a422d983-1769-4d79-9e71-b63bef552d37-proxy-tls\") pod \"machine-config-daemon-xlrkd\" (UID: \"a422d983-1769-4d79-9e71-b63bef552d37\") " pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555861 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a422d983-1769-4d79-9e71-b63bef552d37-mcd-auth-proxy-config\") pod \"machine-config-daemon-xlrkd\" (UID: \"a422d983-1769-4d79-9e71-b63bef552d37\") " pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555866 4753 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-cnibin\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555904 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-system-cni-dir\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555926 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-etc-kubernetes\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555945 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pj69\" (UniqueName: \"kubernetes.io/projected/8a6cead6-0872-4b49-a08c-529805f646f2-kube-api-access-9pj69\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555962 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-multus-socket-dir-parent\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555978 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8a6cead6-0872-4b49-a08c-529805f646f2-multus-daemon-config\") pod \"multus-zr5q8\" 
(UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.555997 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-multus-cni-dir\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.556011 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-multus-conf-dir\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.556046 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-multus-conf-dir\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.556288 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-etc-kubernetes\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.556355 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-run-k8s-cni-cncf-io\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.556381 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-run-multus-certs\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.556483 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-system-cni-dir\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.556506 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-var-lib-cni-bin\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.556526 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-host-run-netns\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.556559 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-multus-socket-dir-parent\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.556558 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/a422d983-1769-4d79-9e71-b63bef552d37-mcd-auth-proxy-config\") pod \"machine-config-daemon-xlrkd\" (UID: \"a422d983-1769-4d79-9e71-b63bef552d37\") " pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.556646 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8a6cead6-0872-4b49-a08c-529805f646f2-cni-binary-copy\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.556783 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a422d983-1769-4d79-9e71-b63bef552d37-rootfs\") pod \"machine-config-daemon-xlrkd\" (UID: \"a422d983-1769-4d79-9e71-b63bef552d37\") " pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.556797 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a6cead6-0872-4b49-a08c-529805f646f2-multus-cni-dir\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.556927 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8a6cead6-0872-4b49-a08c-529805f646f2-multus-daemon-config\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.560769 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a422d983-1769-4d79-9e71-b63bef552d37-proxy-tls\") pod \"machine-config-daemon-xlrkd\" (UID: 
\"a422d983-1769-4d79-9e71-b63bef552d37\") " pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.563936 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.569642 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rd6q\" (UniqueName: \"kubernetes.io/projected/a422d983-1769-4d79-9e71-b63bef552d37-kube-api-access-6rd6q\") pod \"machine-config-daemon-xlrkd\" (UID: \"a422d983-1769-4d79-9e71-b63bef552d37\") " pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.572866 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pj69\" (UniqueName: \"kubernetes.io/projected/8a6cead6-0872-4b49-a08c-529805f646f2-kube-api-access-9pj69\") pod \"multus-zr5q8\" (UID: \"8a6cead6-0872-4b49-a08c-529805f646f2\") " pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.579766 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.590055 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.603849 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.616207 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.627237 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.642296 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.653960 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.655712 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.661358 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zr5q8" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.678738 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.694786 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.712255 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.749605 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-htbfn"] Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.750307 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.752440 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.752561 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.756075 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-25jcm"] Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.756524 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.756731 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.756739 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.756739 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.756990 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.758943 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.765807 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.766644 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.772514 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.784771 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.798455 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.813796 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.827310 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.843275 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.858448 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865294 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovnkube-script-lib\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865344 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7j8m\" (UniqueName: \"kubernetes.io/projected/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-kube-api-access-k7j8m\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865400 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5bbfd1eb-16b3-420d-acab-5770837c14fc-cni-binary-copy\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" 
Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865497 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865573 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-env-overrides\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865605 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovn-node-metrics-cert\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865649 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-slash\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865684 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-kubelet\") pod \"ovnkube-node-htbfn\" (UID: 
\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865706 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-etc-openvswitch\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865728 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-ovn\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865795 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-node-log\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865821 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-log-socket\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865852 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovnkube-config\") pod \"ovnkube-node-htbfn\" (UID: 
\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865882 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd8z9\" (UniqueName: \"kubernetes.io/projected/5bbfd1eb-16b3-420d-acab-5770837c14fc-kube-api-access-kd8z9\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865902 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-var-lib-openvswitch\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865922 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-run-ovn-kubernetes\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865952 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-systemd\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.865992 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/5bbfd1eb-16b3-420d-acab-5770837c14fc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.866014 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-cni-netd\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.866044 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5bbfd1eb-16b3-420d-acab-5770837c14fc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.866070 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5bbfd1eb-16b3-420d-acab-5770837c14fc-os-release\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.866094 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-systemd-units\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.866125 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-openvswitch\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.866372 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5bbfd1eb-16b3-420d-acab-5770837c14fc-system-cni-dir\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.866416 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5bbfd1eb-16b3-420d-acab-5770837c14fc-cnibin\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.866442 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-run-netns\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.866470 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-cni-bin\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: 
I1005 20:15:14.871741 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.893920 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.908589 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.926001 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.949737 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.967184 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.967644 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5bbfd1eb-16b3-420d-acab-5770837c14fc-system-cni-dir\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " 
pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.967737 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5bbfd1eb-16b3-420d-acab-5770837c14fc-cnibin\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.967816 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-run-netns\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.967894 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-cni-bin\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.967979 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5bbfd1eb-16b3-420d-acab-5770837c14fc-cni-binary-copy\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.967757 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5bbfd1eb-16b3-420d-acab-5770837c14fc-system-cni-dir\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " 
pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.967782 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5bbfd1eb-16b3-420d-acab-5770837c14fc-cnibin\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.967921 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-cni-bin\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.968190 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovnkube-script-lib\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.968264 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7j8m\" (UniqueName: \"kubernetes.io/projected/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-kube-api-access-k7j8m\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.968333 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.968401 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-env-overrides\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.968484 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-slash\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.968564 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovn-node-metrics-cert\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.968634 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-kubelet\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.968706 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-etc-openvswitch\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc 
kubenswrapper[4753]: I1005 20:15:14.968769 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.967897 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-run-netns\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.968864 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-ovn\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.968934 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-node-log\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.968998 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-log-socket\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.969085 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovnkube-config\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.969175 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd8z9\" (UniqueName: \"kubernetes.io/projected/5bbfd1eb-16b3-420d-acab-5770837c14fc-kube-api-access-kd8z9\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.969255 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-var-lib-openvswitch\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.969334 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-run-ovn-kubernetes\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.969732 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-run-ovn-kubernetes\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.969795 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-ovn\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.969945 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-node-log\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.969980 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5bbfd1eb-16b3-420d-acab-5770837c14fc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970094 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-systemd\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970211 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-cni-netd\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970320 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/5bbfd1eb-16b3-420d-acab-5770837c14fc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970396 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-slash\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970408 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5bbfd1eb-16b3-420d-acab-5770837c14fc-os-release\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970594 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-systemd-units\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970722 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-var-lib-openvswitch\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.969277 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovnkube-script-lib\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970761 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-etc-openvswitch\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970826 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-systemd\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970837 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-kubelet\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970361 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-env-overrides\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970883 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovnkube-config\") pod \"ovnkube-node-htbfn\" (UID: 
\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970239 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-log-socket\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.968742 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5bbfd1eb-16b3-420d-acab-5770837c14fc-cni-binary-copy\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970970 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5bbfd1eb-16b3-420d-acab-5770837c14fc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970987 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5bbfd1eb-16b3-420d-acab-5770837c14fc-os-release\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.970597 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-cni-netd\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.971105 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-systemd-units\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.971313 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-openvswitch\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.971568 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-openvswitch\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.975963 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovn-node-metrics-cert\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.976233 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5bbfd1eb-16b3-420d-acab-5770837c14fc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-25jcm\" (UID: \"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 
20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.993936 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.994920 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598"} Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.996232 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr5q8" event={"ID":"8a6cead6-0872-4b49-a08c-529805f646f2","Type":"ContainerStarted","Data":"a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039"} Oct 05 20:15:14 crc kubenswrapper[4753]: I1005 20:15:14.996261 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr5q8" event={"ID":"8a6cead6-0872-4b49-a08c-529805f646f2","Type":"ContainerStarted","Data":"44f48d2cd7d23a7fa25481cc6afcb5e0b7fe36ed361a7273f165dcb4dd289efc"} Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.003213 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd8z9\" (UniqueName: \"kubernetes.io/projected/5bbfd1eb-16b3-420d-acab-5770837c14fc-kube-api-access-kd8z9\") pod \"multus-additional-cni-plugins-25jcm\" (UID: 
\"5bbfd1eb-16b3-420d-acab-5770837c14fc\") " pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.012208 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"f4e79e247970335f89b8f52baa9ff77046d6b6268b27ced5897daf482809e4ef"} Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.012367 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"352f86d13b0f2ef3587b35d04fddb8935f863c1b87207bfbf41bca295ebbfb3f"} Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.012318 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7j8m\" (UniqueName: \"kubernetes.io/projected/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-kube-api-access-k7j8m\") pod \"ovnkube-node-htbfn\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.014458 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bcrnh" event={"ID":"0f4dc067-d682-4823-8969-64a0184e623d","Type":"ContainerStarted","Data":"03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914"} Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.036222 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.063038 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.067480 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.081129 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-25jcm" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.090126 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: W1005 20:15:15.127437 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bbfd1eb_16b3_420d_acab_5770837c14fc.slice/crio-0e27f6291d4b4e84f5c5a8fed49e492e7c8134db15a6bd47469b9aa2fa8f8d94 WatchSource:0}: Error finding container 0e27f6291d4b4e84f5c5a8fed49e492e7c8134db15a6bd47469b9aa2fa8f8d94: Status 404 returned error can't find the container with id 0e27f6291d4b4e84f5c5a8fed49e492e7c8134db15a6bd47469b9aa2fa8f8d94 Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.145420 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.176820 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.194957 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.205960 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.215777 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.228031 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.239214 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.250490 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.263449 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05
T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.275651 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.286859 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.298367 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.309658 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f
0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"19
2.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.322214 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.333367 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.344381 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.355160 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.370745 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.375006 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.375107 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:15 crc kubenswrapper[4753]: E1005 20:15:15.375194 4753 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:15:19.375170467 +0000 UTC m=+28.223498699 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:15:15 crc kubenswrapper[4753]: E1005 20:15:15.375271 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:15:15 crc kubenswrapper[4753]: E1005 20:15:15.375325 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:19.375314002 +0000 UTC m=+28.223642234 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.381891 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.397847 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.413574 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.475627 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.475685 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.475721 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:15 crc kubenswrapper[4753]: E1005 20:15:15.475785 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 05 20:15:15 crc 
kubenswrapper[4753]: E1005 20:15:15.475814 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 05 20:15:15 crc kubenswrapper[4753]: E1005 20:15:15.475822 4753 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 05 20:15:15 crc kubenswrapper[4753]: E1005 20:15:15.475877 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:19.475861778 +0000 UTC m=+28.324190010 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 05 20:15:15 crc kubenswrapper[4753]: E1005 20:15:15.475879 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 05 20:15:15 crc kubenswrapper[4753]: E1005 20:15:15.475899 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 05 20:15:15 crc kubenswrapper[4753]: E1005 20:15:15.475825 4753 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:15 crc kubenswrapper[4753]: E1005 20:15:15.475911 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:15 crc kubenswrapper[4753]: E1005 20:15:15.475946 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:19.47593642 +0000 UTC m=+28.324264652 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:15 crc kubenswrapper[4753]: E1005 20:15:15.475964 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:19.475956451 +0000 UTC m=+28.324284683 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.851177 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.851190 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:15 crc kubenswrapper[4753]: E1005 20:15:15.851739 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:15 crc kubenswrapper[4753]: I1005 20:15:15.851237 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:15 crc kubenswrapper[4753]: E1005 20:15:15.851852 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:15 crc kubenswrapper[4753]: E1005 20:15:15.851932 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.018990 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9"} Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.027068 4753 generic.go:334] "Generic (PLEG): container finished" podID="5bbfd1eb-16b3-420d-acab-5770837c14fc" containerID="1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8" exitCode=0 Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.027269 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" event={"ID":"5bbfd1eb-16b3-420d-acab-5770837c14fc","Type":"ContainerDied","Data":"1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8"} Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.027338 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" event={"ID":"5bbfd1eb-16b3-420d-acab-5770837c14fc","Type":"ContainerStarted","Data":"0e27f6291d4b4e84f5c5a8fed49e492e7c8134db15a6bd47469b9aa2fa8f8d94"} Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.029876 4753 generic.go:334] "Generic (PLEG): container finished" 
podID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerID="4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9" exitCode=0 Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.030086 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerDied","Data":"4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9"} Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.030169 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerStarted","Data":"aae37b94c3f6b9a2fa8bd20b3a8be44fa4f89f3ef663db20e9978ff3f436eb06"} Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.047339 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.083763 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.105326 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.129004 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.143851 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.163950 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.176894 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.202110 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.224176 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.240801 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.273637 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.308530 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.368558 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.409767 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.420544 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.431367 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.442952 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.459389 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.472459 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.485068 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f41
02e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.495553 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.506335 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.522675 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.534737 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.546699 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:16 crc kubenswrapper[4753]: I1005 20:15:16.556560 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.035985 4753 generic.go:334] "Generic (PLEG): container finished" podID="5bbfd1eb-16b3-420d-acab-5770837c14fc" containerID="404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba" exitCode=0 Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.036066 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" event={"ID":"5bbfd1eb-16b3-420d-acab-5770837c14fc","Type":"ContainerDied","Data":"404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba"} Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.039973 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerStarted","Data":"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d"} Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.039996 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerStarted","Data":"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58"} Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.040011 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" 
event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerStarted","Data":"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243"} Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.040021 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerStarted","Data":"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c"} Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.040030 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerStarted","Data":"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4"} Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.056720 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.072812 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.095522 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.112267 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.132598 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.149120 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.165492 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.177874 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.187339 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.198411 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.208884 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.222603 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.233567 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.799617 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.801958 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.802009 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.802028 4753 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.802126 4753 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.806723 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-v2pmn"] Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.807284 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v2pmn" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.811682 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.812008 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.813227 4753 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.813571 4753 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.814203 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.814976 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.815041 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.815059 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 
20:15:17.815087 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.815112 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:17Z","lastTransitionTime":"2025-10-05T20:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.816334 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.827985 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: E1005 20:15:17.838992 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.842438 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.842488 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.842506 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.842529 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.842546 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:17Z","lastTransitionTime":"2025-10-05T20:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.851640 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.851711 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.851779 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:17 crc kubenswrapper[4753]: E1005 20:15:17.851857 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:17 crc kubenswrapper[4753]: E1005 20:15:17.852002 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:17 crc kubenswrapper[4753]: E1005 20:15:17.852176 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.852507 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: E1005 20:15:17.862671 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.867079 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.867126 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.867167 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.867191 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.867208 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:17Z","lastTransitionTime":"2025-10-05T20:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.875601 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b6268b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: E1005 20:15:17.881356 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.886113 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.886152 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.886161 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.886176 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.886186 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:17Z","lastTransitionTime":"2025-10-05T20:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.890669 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.898554 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a89c4e01-7849-478a-bf2b-07701a1c5ef3-host\") pod \"node-ca-v2pmn\" (UID: \"a89c4e01-7849-478a-bf2b-07701a1c5ef3\") " pod="openshift-image-registry/node-ca-v2pmn" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.898689 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-6chvc\" (UniqueName: \"kubernetes.io/projected/a89c4e01-7849-478a-bf2b-07701a1c5ef3-kube-api-access-6chvc\") pod \"node-ca-v2pmn\" (UID: \"a89c4e01-7849-478a-bf2b-07701a1c5ef3\") " pod="openshift-image-registry/node-ca-v2pmn" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.898844 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a89c4e01-7849-478a-bf2b-07701a1c5ef3-serviceca\") pod \"node-ca-v2pmn\" (UID: \"a89c4e01-7849-478a-bf2b-07701a1c5ef3\") " pod="openshift-image-registry/node-ca-v2pmn" Oct 05 20:15:17 crc kubenswrapper[4753]: E1005 20:15:17.899468 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.904435 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.904465 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.904473 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.904509 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.904520 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:17Z","lastTransitionTime":"2025-10-05T20:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.905117 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: E1005 20:15:17.919802 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: E1005 20:15:17.919911 4753 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.921550 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.921601 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.921615 4753 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.921638 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.921650 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:17Z","lastTransitionTime":"2025-10-05T20:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.922771 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa3
77579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.942674 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.961697 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 
20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.974927 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.986099 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:17Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.999509 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a89c4e01-7849-478a-bf2b-07701a1c5ef3-serviceca\") pod \"node-ca-v2pmn\" (UID: \"a89c4e01-7849-478a-bf2b-07701a1c5ef3\") " pod="openshift-image-registry/node-ca-v2pmn" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.999610 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a89c4e01-7849-478a-bf2b-07701a1c5ef3-host\") pod \"node-ca-v2pmn\" (UID: \"a89c4e01-7849-478a-bf2b-07701a1c5ef3\") " pod="openshift-image-registry/node-ca-v2pmn" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.999657 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6chvc\" (UniqueName: \"kubernetes.io/projected/a89c4e01-7849-478a-bf2b-07701a1c5ef3-kube-api-access-6chvc\") pod \"node-ca-v2pmn\" (UID: \"a89c4e01-7849-478a-bf2b-07701a1c5ef3\") " pod="openshift-image-registry/node-ca-v2pmn" Oct 05 20:15:17 crc kubenswrapper[4753]: I1005 20:15:17.999763 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a89c4e01-7849-478a-bf2b-07701a1c5ef3-host\") pod \"node-ca-v2pmn\" (UID: \"a89c4e01-7849-478a-bf2b-07701a1c5ef3\") " pod="openshift-image-registry/node-ca-v2pmn" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.001075 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a89c4e01-7849-478a-bf2b-07701a1c5ef3-serviceca\") pod \"node-ca-v2pmn\" (UID: \"a89c4e01-7849-478a-bf2b-07701a1c5ef3\") " pod="openshift-image-registry/node-ca-v2pmn" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.003722 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.020129 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.024027 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.024067 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.024081 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 
20:15:18.024098 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.024111 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:18Z","lastTransitionTime":"2025-10-05T20:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.032185 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6chvc\" (UniqueName: \"kubernetes.io/projected/a89c4e01-7849-478a-bf2b-07701a1c5ef3-kube-api-access-6chvc\") pod \"node-ca-v2pmn\" (UID: \"a89c4e01-7849-478a-bf2b-07701a1c5ef3\") " pod="openshift-image-registry/node-ca-v2pmn" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.035663 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.047086 4753 generic.go:334] "Generic (PLEG): container finished" podID="5bbfd1eb-16b3-420d-acab-5770837c14fc" containerID="6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192" exitCode=0 Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.047170 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" 
event={"ID":"5bbfd1eb-16b3-420d-acab-5770837c14fc","Type":"ContainerDied","Data":"6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192"} Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.057891 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerStarted","Data":"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd"} Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.061725 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb
276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.085908 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f41
02e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.101865 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.116760 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.126647 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-v2pmn" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.127555 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.127597 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.127612 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.127631 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.127643 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:18Z","lastTransitionTime":"2025-10-05T20:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.146049 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: W1005 20:15:18.146639 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda89c4e01_7849_478a_bf2b_07701a1c5ef3.slice/crio-adcedee4e804e71ee056d94fa15944e1cb2a0372b46e95fa6e80a1195ae0db37 WatchSource:0}: Error finding container adcedee4e804e71ee056d94fa15944e1cb2a0372b46e95fa6e80a1195ae0db37: Status 404 returned error can't find the container with id adcedee4e804e71ee056d94fa15944e1cb2a0372b46e95fa6e80a1195ae0db37 Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.158264 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.167361 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.179174 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.191074 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.201246 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.210614 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.218265 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.226388 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.234484 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.234593 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.234703 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.234805 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.234867 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:18Z","lastTransitionTime":"2025-10-05T20:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.244698 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z 
is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.257455 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:18Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 
20:15:18.336856 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.336886 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.336895 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.336910 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.336920 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:18Z","lastTransitionTime":"2025-10-05T20:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.439914 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.439965 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.439981 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.440004 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.440021 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:18Z","lastTransitionTime":"2025-10-05T20:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.543073 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.543109 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.543121 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.543162 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.543182 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:18Z","lastTransitionTime":"2025-10-05T20:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.554777 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.555345 4753 scope.go:117] "RemoveContainer" containerID="b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b" Oct 05 20:15:18 crc kubenswrapper[4753]: E1005 20:15:18.555575 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.645173 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.645211 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.645221 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.645236 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.645245 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:18Z","lastTransitionTime":"2025-10-05T20:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.747319 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.747370 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.747382 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.747409 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.747422 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:18Z","lastTransitionTime":"2025-10-05T20:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.849554 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.849594 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.849605 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.849645 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.849657 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:18Z","lastTransitionTime":"2025-10-05T20:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.951820 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.951860 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.951870 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.951887 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:18 crc kubenswrapper[4753]: I1005 20:15:18.951897 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:18Z","lastTransitionTime":"2025-10-05T20:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.053970 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.054003 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.054013 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.054028 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.054039 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:19Z","lastTransitionTime":"2025-10-05T20:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.063639 4753 generic.go:334] "Generic (PLEG): container finished" podID="5bbfd1eb-16b3-420d-acab-5770837c14fc" containerID="02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005" exitCode=0 Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.063723 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" event={"ID":"5bbfd1eb-16b3-420d-acab-5770837c14fc","Type":"ContainerDied","Data":"02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005"} Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.066298 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v2pmn" event={"ID":"a89c4e01-7849-478a-bf2b-07701a1c5ef3","Type":"ContainerStarted","Data":"cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97"} Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.066337 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v2pmn" event={"ID":"a89c4e01-7849-478a-bf2b-07701a1c5ef3","Type":"ContainerStarted","Data":"adcedee4e804e71ee056d94fa15944e1cb2a0372b46e95fa6e80a1195ae0db37"} Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.081087 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.092552 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\
\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.104624 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.120567 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.132624 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.147113 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.164943 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.164977 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.165617 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.165651 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.166203 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:19Z","lastTransitionTime":"2025-10-05T20:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.168727 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.181006 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.192331 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.206203 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.220295 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.236818 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.262729 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.269642 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.269687 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.269698 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:19 crc 
kubenswrapper[4753]: I1005 20:15:19.269716 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.269731 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:19Z","lastTransitionTime":"2025-10-05T20:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.282650 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.309674 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.339953 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.352545 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.366200 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.371294 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.371318 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.371326 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.371339 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.371347 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:19Z","lastTransitionTime":"2025-10-05T20:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.377445 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.386468 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.394884 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.407625 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.413209 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:15:19 crc kubenswrapper[4753]: E1005 20:15:19.413354 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:15:27.413335742 +0000 UTC m=+36.261663984 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.413426 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:19 crc kubenswrapper[4753]: E1005 20:15:19.413560 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:15:19 crc kubenswrapper[4753]: E1005 20:15:19.413611 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:27.41360089 +0000 UTC m=+36.261929122 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.419304 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":
\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.431251 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b8
19eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.443428 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.453963 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.465542 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.473222 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.473264 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.473274 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.473290 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.473300 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:19Z","lastTransitionTime":"2025-10-05T20:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.483070 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.514582 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.514640 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.514659 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:19 crc 
kubenswrapper[4753]: E1005 20:15:19.514730 4753 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 05 20:15:19 crc kubenswrapper[4753]: E1005 20:15:19.514758 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 05 20:15:19 crc kubenswrapper[4753]: E1005 20:15:19.514771 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 05 20:15:19 crc kubenswrapper[4753]: E1005 20:15:19.514781 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:19 crc kubenswrapper[4753]: E1005 20:15:19.514786 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 05 20:15:19 crc kubenswrapper[4753]: E1005 20:15:19.514806 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:27.514787586 +0000 UTC m=+36.363115828 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 05 20:15:19 crc kubenswrapper[4753]: E1005 20:15:19.514809 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 05 20:15:19 crc kubenswrapper[4753]: E1005 20:15:19.514823 4753 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:19 crc kubenswrapper[4753]: E1005 20:15:19.514824 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:27.514816297 +0000 UTC m=+36.363144539 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:19 crc kubenswrapper[4753]: E1005 20:15:19.514867 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:27.514854939 +0000 UTC m=+36.363183171 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.575747 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.575781 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.575789 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.575803 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.575812 4753 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:19Z","lastTransitionTime":"2025-10-05T20:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.677654 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.677687 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.677698 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.677712 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.677720 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:19Z","lastTransitionTime":"2025-10-05T20:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.779745 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.779776 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.779784 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.779797 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.779806 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:19Z","lastTransitionTime":"2025-10-05T20:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.851913 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:19 crc kubenswrapper[4753]: E1005 20:15:19.852091 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.851913 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.852197 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:19 crc kubenswrapper[4753]: E1005 20:15:19.852293 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:19 crc kubenswrapper[4753]: E1005 20:15:19.852374 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.882039 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.882079 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.882090 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.882105 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.882118 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:19Z","lastTransitionTime":"2025-10-05T20:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.985498 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.985538 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.985553 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.985572 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:19 crc kubenswrapper[4753]: I1005 20:15:19.985588 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:19Z","lastTransitionTime":"2025-10-05T20:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.073917 4753 generic.go:334] "Generic (PLEG): container finished" podID="5bbfd1eb-16b3-420d-acab-5770837c14fc" containerID="29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51" exitCode=0 Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.073959 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" event={"ID":"5bbfd1eb-16b3-420d-acab-5770837c14fc","Type":"ContainerDied","Data":"29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51"} Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.079582 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerStarted","Data":"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642"} Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.087844 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.087897 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.087914 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.087936 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.087952 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:20Z","lastTransitionTime":"2025-10-05T20:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.091028 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:20Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.108684 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:20Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.127168 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:20Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.142363 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:20Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.154295 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:20Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.165516 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:20Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.182337 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:20Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.190078 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.190117 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.190129 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.190159 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.190169 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:20Z","lastTransitionTime":"2025-10-05T20:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.196305 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:20Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.208449 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:20Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.219411 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:20Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.233422 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:20Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.248841 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:20Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.262437 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:20Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.274585 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:20Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.292460 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.292497 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.292508 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.292527 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.292541 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:20Z","lastTransitionTime":"2025-10-05T20:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.394935 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.394971 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.394980 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.394996 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.395005 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:20Z","lastTransitionTime":"2025-10-05T20:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.497461 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.497500 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.497510 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.497533 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.497550 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:20Z","lastTransitionTime":"2025-10-05T20:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.600696 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.600750 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.600760 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.600778 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.600790 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:20Z","lastTransitionTime":"2025-10-05T20:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.704193 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.704249 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.704270 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.704293 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.704306 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:20Z","lastTransitionTime":"2025-10-05T20:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.807471 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.807555 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.807579 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.807627 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.807683 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:20Z","lastTransitionTime":"2025-10-05T20:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.910429 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.910473 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.910481 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.910499 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:20 crc kubenswrapper[4753]: I1005 20:15:20.910510 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:20Z","lastTransitionTime":"2025-10-05T20:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.012651 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.012763 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.012780 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.012800 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.012811 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:21Z","lastTransitionTime":"2025-10-05T20:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.086298 4753 generic.go:334] "Generic (PLEG): container finished" podID="5bbfd1eb-16b3-420d-acab-5770837c14fc" containerID="ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694" exitCode=0 Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.086344 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" event={"ID":"5bbfd1eb-16b3-420d-acab-5770837c14fc","Type":"ContainerDied","Data":"ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694"} Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.100574 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.112739 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.115210 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.115245 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.115257 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.115276 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.115289 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:21Z","lastTransitionTime":"2025-10-05T20:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.123674 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.135817 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.148480 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.161222 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.173816 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.190958 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T2
0:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.207814 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.217391 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:21 crc 
kubenswrapper[4753]: I1005 20:15:21.217424 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.217434 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.217450 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.217460 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:21Z","lastTransitionTime":"2025-10-05T20:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.223869 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f
93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.245563 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 
20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.282362 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.305276 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.325465 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.325504 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.325515 4753 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.325534 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.325547 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:21Z","lastTransitionTime":"2025-10-05T20:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.325582 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.428732 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.428772 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.428783 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.428797 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.428807 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:21Z","lastTransitionTime":"2025-10-05T20:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.531752 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.531791 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.531800 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.531818 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.531829 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:21Z","lastTransitionTime":"2025-10-05T20:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.635313 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.635545 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.635641 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.635731 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.636003 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:21Z","lastTransitionTime":"2025-10-05T20:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.739491 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.739552 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.739565 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.739585 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.739599 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:21Z","lastTransitionTime":"2025-10-05T20:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.842196 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.842237 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.842248 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.842263 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.842271 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:21Z","lastTransitionTime":"2025-10-05T20:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.851409 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.851509 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:21 crc kubenswrapper[4753]: E1005 20:15:21.851551 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.851640 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:21 crc kubenswrapper[4753]: E1005 20:15:21.851758 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:21 crc kubenswrapper[4753]: E1005 20:15:21.851897 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.864384 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.876188 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.889104 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.900441 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.911116 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.926005 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.943537 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.945383 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.945410 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.945420 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.945436 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.945447 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:21Z","lastTransitionTime":"2025-10-05T20:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.964997 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.979318 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:21 crc kubenswrapper[4753]: I1005 20:15:21.996053 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.009960 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.031630 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.046679 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.048222 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.048245 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.048253 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.048267 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.048278 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:22Z","lastTransitionTime":"2025-10-05T20:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.059360 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba2930559
47cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.091683 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerStarted","Data":"aa0b0807d77466420daf766e070ca5c804f5cc1cb2e6ac05cb9e5914ea920cb7"} Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.091920 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.098202 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" event={"ID":"5bbfd1eb-16b3-420d-acab-5770837c14fc","Type":"ContainerStarted","Data":"dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510"} 
Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.107878 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.121050 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.121455 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.134747 4753 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b6268b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.144702 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.151525 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.151551 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.151563 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.151582 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.151593 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:22Z","lastTransitionTime":"2025-10-05T20:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.156322 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.168664 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.183830 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.197076 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f41
02e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.209608 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.223108 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.241371 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b0807d77466420daf766e070ca5c804f5cc1cb2e6ac05cb9e5914ea920cb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.254796 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.254855 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.254868 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.254890 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.255247 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:22Z","lastTransitionTime":"2025-10-05T20:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.256404 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.270738 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.282842 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.297547 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.317981 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b0807d77466420daf766e070ca5c804f5cc1cb2e6ac05cb9e5914ea920cb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.332960 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.347709 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.357683 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.357766 4753 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.357779 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.357846 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.357857 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:22Z","lastTransitionTime":"2025-10-05T20:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.361354 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.373998 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.386488 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.396157 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.407325 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.419373 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.428724 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.442842 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.458959 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.460425 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.460457 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.460469 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.460485 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.460496 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:22Z","lastTransitionTime":"2025-10-05T20:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.474087 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.563582 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.563632 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.563642 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.563660 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.563674 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:22Z","lastTransitionTime":"2025-10-05T20:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.665884 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.665932 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.665944 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.665963 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.665976 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:22Z","lastTransitionTime":"2025-10-05T20:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.768535 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.768749 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.768812 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.768876 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.768940 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:22Z","lastTransitionTime":"2025-10-05T20:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.871426 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.871645 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.871715 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.871790 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.871855 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:22Z","lastTransitionTime":"2025-10-05T20:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.974350 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.974414 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.974429 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.974449 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:22 crc kubenswrapper[4753]: I1005 20:15:22.974462 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:22Z","lastTransitionTime":"2025-10-05T20:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.077336 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.077404 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.077428 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.077459 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.077478 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:23Z","lastTransitionTime":"2025-10-05T20:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.102463 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.102511 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.127595 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.141856 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:23Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.154907 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:23Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.169027 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:23Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.180254 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.180404 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.180470 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.180556 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.180635 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:23Z","lastTransitionTime":"2025-10-05T20:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.184299 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:23Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.202739 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:23Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.214485 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:23Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.224502 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:23Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.236444 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T2
0:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:23Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.255753 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:23Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.272440 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f846
74baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:
15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:23Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.283751 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.283813 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.283836 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.283861 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.283877 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:23Z","lastTransitionTime":"2025-10-05T20:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.288616 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:23Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.303917 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:23Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.317692 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:23Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.345695 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b0807d77466420daf766e070ca5c804f5cc1cb2e6ac05cb9e5914ea920cb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:23Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.386672 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.386738 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.386757 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.386793 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.386815 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:23Z","lastTransitionTime":"2025-10-05T20:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.489765 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.489843 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.489856 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.489879 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.489891 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:23Z","lastTransitionTime":"2025-10-05T20:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.593102 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.593445 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.593544 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.593617 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.593675 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:23Z","lastTransitionTime":"2025-10-05T20:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.697185 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.697245 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.697261 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.697284 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.697297 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:23Z","lastTransitionTime":"2025-10-05T20:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.799858 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.799936 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.799956 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.799986 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.800003 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:23Z","lastTransitionTime":"2025-10-05T20:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.851163 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.851656 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:23 crc kubenswrapper[4753]: E1005 20:15:23.852615 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.852005 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:23 crc kubenswrapper[4753]: E1005 20:15:23.852963 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:23 crc kubenswrapper[4753]: E1005 20:15:23.851934 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.902297 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.902340 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.902349 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.902366 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:23 crc kubenswrapper[4753]: I1005 20:15:23.902379 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:23Z","lastTransitionTime":"2025-10-05T20:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.005170 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.005217 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.005233 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.005249 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.005259 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:24Z","lastTransitionTime":"2025-10-05T20:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.107215 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.107270 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.107285 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.107307 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.107321 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:24Z","lastTransitionTime":"2025-10-05T20:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.210064 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.210098 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.210106 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.210121 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.210131 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:24Z","lastTransitionTime":"2025-10-05T20:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.313269 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.313307 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.313317 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.313333 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.313345 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:24Z","lastTransitionTime":"2025-10-05T20:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.415753 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.415812 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.415832 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.415850 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.415861 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:24Z","lastTransitionTime":"2025-10-05T20:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.518440 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.518477 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.518488 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.518504 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.518513 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:24Z","lastTransitionTime":"2025-10-05T20:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.621323 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.621375 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.621392 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.621414 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.621431 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:24Z","lastTransitionTime":"2025-10-05T20:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.723942 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.723992 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.724003 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.724018 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.724033 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:24Z","lastTransitionTime":"2025-10-05T20:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.826862 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.826931 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.826947 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.826966 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.826979 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:24Z","lastTransitionTime":"2025-10-05T20:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.929382 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.929430 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.929447 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.929469 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:24 crc kubenswrapper[4753]: I1005 20:15:24.929485 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:24Z","lastTransitionTime":"2025-10-05T20:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.032567 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.032655 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.032681 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.032712 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.032733 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:25Z","lastTransitionTime":"2025-10-05T20:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.109654 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/0.log" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.113670 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerID="aa0b0807d77466420daf766e070ca5c804f5cc1cb2e6ac05cb9e5914ea920cb7" exitCode=1 Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.113712 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerDied","Data":"aa0b0807d77466420daf766e070ca5c804f5cc1cb2e6ac05cb9e5914ea920cb7"} Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.115064 4753 scope.go:117] "RemoveContainer" containerID="aa0b0807d77466420daf766e070ca5c804f5cc1cb2e6ac05cb9e5914ea920cb7" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.125986 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:25Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.150340 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.150373 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.150387 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.150403 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.150415 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:25Z","lastTransitionTime":"2025-10-05T20:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.157049 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:25Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.193345 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:25Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.215334 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:25Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.233952 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:25Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.248541 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:25Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.253195 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.253236 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.253304 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.253430 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.253449 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:25Z","lastTransitionTime":"2025-10-05T20:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.259315 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b6268b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-05T20:15:25Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.270742 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-05T20:15:25Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.283538 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:25Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.295722 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:25Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.315740 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0b0807d77466420daf766e070ca5c804f5cc1cb2e6ac05cb9e5914ea920cb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0b0807d77466420daf766e070ca5c804f5cc1cb2e6ac05cb9e5914ea920cb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20
:15:24Z\\\",\\\"message\\\":\\\"] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1005 20:15:24.344982 5962 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1005 20:15:24.345664 5962 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1005 20:15:24.345764 5962 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1005 20:15:24.345859 5962 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1005 20:15:24.345903 5962 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1005 20:15:24.345958 5962 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1005 20:15:24.346026 5962 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1005 20:15:24.346097 5962 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1005 20:15:24.346203 5962 factory.go:656] Stopping watch factory\\\\nI1005 20:15:24.346276 5962 ovnkube.go:599] Stopped ovnkube\\\\nI1005 20:15:24.346285 5962 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1005 20:15:24.346224 5962 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1005 20:15:24.346238 5962 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1005 20:15:24.346257 5962 handler.go:208] Removed *v1.Node event handler 
7\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc
8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:25Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.337458 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f41
02e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:25Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.347938 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:25Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.355713 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.355735 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.355743 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.355757 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.355767 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:25Z","lastTransitionTime":"2025-10-05T20:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.359850 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:25Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.458031 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.458069 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.458083 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.458100 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.458111 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:25Z","lastTransitionTime":"2025-10-05T20:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.561016 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.561116 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.561132 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.561170 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.561185 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:25Z","lastTransitionTime":"2025-10-05T20:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.663359 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.663384 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.663392 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.663404 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.663413 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:25Z","lastTransitionTime":"2025-10-05T20:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.765567 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.765594 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.765604 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.765617 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.765625 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:25Z","lastTransitionTime":"2025-10-05T20:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.854875 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:25 crc kubenswrapper[4753]: E1005 20:15:25.855004 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.855381 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:25 crc kubenswrapper[4753]: E1005 20:15:25.855525 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.855736 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:25 crc kubenswrapper[4753]: E1005 20:15:25.855810 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.867992 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.868022 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.868033 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.868048 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.868058 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:25Z","lastTransitionTime":"2025-10-05T20:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.970529 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.970564 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.970572 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.970586 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:25 crc kubenswrapper[4753]: I1005 20:15:25.970597 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:25Z","lastTransitionTime":"2025-10-05T20:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.073053 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.073099 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.073111 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.073128 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.073179 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:26Z","lastTransitionTime":"2025-10-05T20:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.118877 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/1.log" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.119585 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/0.log" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.122268 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerID="7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c" exitCode=1 Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.122304 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerDied","Data":"7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c"} Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.122344 4753 scope.go:117] "RemoveContainer" containerID="aa0b0807d77466420daf766e070ca5c804f5cc1cb2e6ac05cb9e5914ea920cb7" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.123080 4753 scope.go:117] "RemoveContainer" containerID="7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c" Oct 05 20:15:26 crc kubenswrapper[4753]: E1005 20:15:26.123314 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.135674 4753 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:26Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.150093 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:26Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.163325 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e5431
9f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:26Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.175222 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.175246 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.175253 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.175266 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.175275 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:26Z","lastTransitionTime":"2025-10-05T20:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.177465 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:26Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.189964 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:26Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.201929 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:26Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.221838 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0b0807d77466420daf766e070ca5c804f5cc1cb2e6ac05cb9e5914ea920cb7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:24Z\\\",\\\"message\\\":\\\"] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1005 20:15:24.344982 5962 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1005 20:15:24.345664 5962 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1005 20:15:24.345764 5962 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1005 20:15:24.345859 5962 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1005 20:15:24.345903 5962 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1005 20:15:24.345958 5962 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1005 20:15:24.346026 5962 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1005 20:15:24.346097 5962 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1005 20:15:24.346203 5962 factory.go:656] Stopping watch factory\\\\nI1005 20:15:24.346276 5962 ovnkube.go:599] Stopped ovnkube\\\\nI1005 20:15:24.346285 5962 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1005 20:15:24.346224 5962 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1005 20:15:24.346238 5962 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1005 20:15:24.346257 5962 handler.go:208] Removed *v1.Node event handler 7\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:26Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:26.053978 6098 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:15:26.054003 6098 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current ti\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir
\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\
\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:26Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:26 crc kubenswrapper[4753]: 
I1005 20:15:26.234726 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:26Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.246639 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:26Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.258626 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:26Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.272114 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:26Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.277619 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.277650 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.277661 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.277677 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.277691 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:26Z","lastTransitionTime":"2025-10-05T20:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.290739 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:26Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.302071 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:26Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.312532 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:26Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.380994 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.381041 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.381053 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.381071 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.381084 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:26Z","lastTransitionTime":"2025-10-05T20:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.483339 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.483404 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.483423 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.483448 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.483467 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:26Z","lastTransitionTime":"2025-10-05T20:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.586032 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.586076 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.586087 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.586107 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.586119 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:26Z","lastTransitionTime":"2025-10-05T20:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.687839 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.687877 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.687889 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.687907 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.687918 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:26Z","lastTransitionTime":"2025-10-05T20:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.790960 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.790993 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.791003 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.791018 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.791027 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:26Z","lastTransitionTime":"2025-10-05T20:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.893251 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.893311 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.893329 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.893351 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.893368 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:26Z","lastTransitionTime":"2025-10-05T20:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.996112 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.996208 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.996227 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.996250 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:26 crc kubenswrapper[4753]: I1005 20:15:26.996269 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:26Z","lastTransitionTime":"2025-10-05T20:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.099507 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.099542 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.099551 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.099566 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.099579 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:27Z","lastTransitionTime":"2025-10-05T20:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.130226 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/1.log" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.132811 4753 scope.go:117] "RemoveContainer" containerID="7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c" Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.132930 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.151585 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.160879 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l"] Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.161542 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.163276 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.163297 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.166624 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.176423 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.188199 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.198351 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.201762 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.201795 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.201806 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.201823 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.201835 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:27Z","lastTransitionTime":"2025-10-05T20:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.211126 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b6268b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.218972 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.229831 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.243944 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.257738 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f846
74baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:
15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.269345 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.279956 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.290886 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.294130 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8eda6d8e-f029-46df-8965-bb8302a46cf7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-88v8l\" (UID: \"8eda6d8e-f029-46df-8965-bb8302a46cf7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.294169 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8eda6d8e-f029-46df-8965-bb8302a46cf7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-88v8l\" (UID: \"8eda6d8e-f029-46df-8965-bb8302a46cf7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" Oct 05 20:15:27 crc 
kubenswrapper[4753]: I1005 20:15:27.294188 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w95mm\" (UniqueName: \"kubernetes.io/projected/8eda6d8e-f029-46df-8965-bb8302a46cf7-kube-api-access-w95mm\") pod \"ovnkube-control-plane-749d76644c-88v8l\" (UID: \"8eda6d8e-f029-46df-8965-bb8302a46cf7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.294208 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8eda6d8e-f029-46df-8965-bb8302a46cf7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-88v8l\" (UID: \"8eda6d8e-f029-46df-8965-bb8302a46cf7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.303207 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.303241 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.303251 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.303265 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.303275 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:27Z","lastTransitionTime":"2025-10-05T20:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.316071 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:26Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:26.053978 6098 model_client.go:382] Update operations generated as: [{Op:update 
Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:15:26.054003 6098 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current ti\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.333258 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f41
02e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.347394 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.360578 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.376442 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:26Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:26.053978 6098 model_client.go:382] Update operations generated as: [{Op:update 
Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:15:26.054003 6098 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current ti\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.390455 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.395223 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8eda6d8e-f029-46df-8965-bb8302a46cf7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-88v8l\" (UID: \"8eda6d8e-f029-46df-8965-bb8302a46cf7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.395277 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8eda6d8e-f029-46df-8965-bb8302a46cf7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-88v8l\" (UID: \"8eda6d8e-f029-46df-8965-bb8302a46cf7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.395311 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w95mm\" (UniqueName: \"kubernetes.io/projected/8eda6d8e-f029-46df-8965-bb8302a46cf7-kube-api-access-w95mm\") pod \"ovnkube-control-plane-749d76644c-88v8l\" (UID: \"8eda6d8e-f029-46df-8965-bb8302a46cf7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.395353 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8eda6d8e-f029-46df-8965-bb8302a46cf7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-88v8l\" (UID: \"8eda6d8e-f029-46df-8965-bb8302a46cf7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.395877 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8eda6d8e-f029-46df-8965-bb8302a46cf7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-88v8l\" (UID: \"8eda6d8e-f029-46df-8965-bb8302a46cf7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.396315 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8eda6d8e-f029-46df-8965-bb8302a46cf7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-88v8l\" (UID: \"8eda6d8e-f029-46df-8965-bb8302a46cf7\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.401029 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8eda6d8e-f029-46df-8965-bb8302a46cf7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-88v8l\" (UID: \"8eda6d8e-f029-46df-8965-bb8302a46cf7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.405778 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.405802 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.405810 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.405823 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.405832 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:27Z","lastTransitionTime":"2025-10-05T20:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.406429 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.418049 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w95mm\" (UniqueName: \"kubernetes.io/projected/8eda6d8e-f029-46df-8965-bb8302a46cf7-kube-api-access-w95mm\") pod \"ovnkube-control-plane-749d76644c-88v8l\" (UID: \"8eda6d8e-f029-46df-8965-bb8302a46cf7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.422511 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.439896 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.453192 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.465527 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.471879 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.483175 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"ho
st\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.496515 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.496638 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:15:43.49662164 +0000 UTC m=+52.344949872 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.496740 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.496848 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.496892 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:43.496884109 +0000 UTC m=+52.345212341 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.497462 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":
\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.509418 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.509448 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.509457 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.509470 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.509479 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:27Z","lastTransitionTime":"2025-10-05T20:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.510821 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z 
is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.527817 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.539795 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:27Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.598540 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.598596 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.598629 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.598701 4753 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.598748 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.598766 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.598777 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.598756 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:43.59874129 +0000 UTC m=+52.447069522 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.598812 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:43.598804083 +0000 UTC m=+52.447132315 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.598701 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.598826 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.598832 4753 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.598851 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:43.598844784 +0000 UTC m=+52.447173016 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.611677 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.611707 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.611717 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.611731 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.611739 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:27Z","lastTransitionTime":"2025-10-05T20:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.714383 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.714415 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.714426 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.714443 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.714454 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:27Z","lastTransitionTime":"2025-10-05T20:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.817102 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.817156 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.817168 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.817185 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.817195 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:27Z","lastTransitionTime":"2025-10-05T20:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.851367 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.851429 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.851519 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.851582 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.851674 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:27 crc kubenswrapper[4753]: E1005 20:15:27.851757 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.919813 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.919882 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.919895 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.919934 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:27 crc kubenswrapper[4753]: I1005 20:15:27.919947 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:27Z","lastTransitionTime":"2025-10-05T20:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.022286 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.022313 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.022321 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.022334 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.022343 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:28Z","lastTransitionTime":"2025-10-05T20:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.060191 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.060256 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.060278 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.060305 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.060326 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:28Z","lastTransitionTime":"2025-10-05T20:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:28 crc kubenswrapper[4753]: E1005 20:15:28.076411 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.080191 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.080258 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.080279 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.080309 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.080328 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:28Z","lastTransitionTime":"2025-10-05T20:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:28 crc kubenswrapper[4753]: E1005 20:15:28.097046 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.100490 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.100550 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.100571 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.100599 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.100618 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:28Z","lastTransitionTime":"2025-10-05T20:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:28 crc kubenswrapper[4753]: E1005 20:15:28.119131 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.122988 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.123014 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.123026 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.123038 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.123047 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:28Z","lastTransitionTime":"2025-10-05T20:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.137384 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" event={"ID":"8eda6d8e-f029-46df-8965-bb8302a46cf7","Type":"ContainerStarted","Data":"5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9"} Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.137462 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" event={"ID":"8eda6d8e-f029-46df-8965-bb8302a46cf7","Type":"ContainerStarted","Data":"886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e"} Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.137484 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" event={"ID":"8eda6d8e-f029-46df-8965-bb8302a46cf7","Type":"ContainerStarted","Data":"71ebba76d9c4561cccec938a53489e996ef16987b8a9ebc38e7ec508c4cbf78e"} Oct 05 20:15:28 crc kubenswrapper[4753]: E1005 20:15:28.145445 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.150045 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.150071 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.150082 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.150096 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.150106 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:28Z","lastTransitionTime":"2025-10-05T20:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.161677 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: E1005 20:15:28.168877 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: E1005 20:15:28.168986 4753 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.170564 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.170584 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.170592 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.170604 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.170613 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:28Z","lastTransitionTime":"2025-10-05T20:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.178539 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.197755 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.220323 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:26Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:26.053978 6098 model_client.go:382] Update operations generated as: [{Op:update 
Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:15:26.054003 6098 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current ti\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.234363 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.252307 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.253390 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ktspr"] Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.253906 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:28 crc kubenswrapper[4753]: E1005 20:15:28.253974 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.261742 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.272742 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.272768 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.272777 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.272789 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.272798 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:28Z","lastTransitionTime":"2025-10-05T20:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.273408 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.284914 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.295889 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.303825 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs\") pod \"network-metrics-daemon-ktspr\" (UID: \"f99b8ef3-70ed-42e4-9217-a300fcd562d9\") " pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.303885 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7dct\" (UniqueName: \"kubernetes.io/projected/f99b8ef3-70ed-42e4-9217-a300fcd562d9-kube-api-access-t7dct\") pod \"network-metrics-daemon-ktspr\" (UID: \"f99b8ef3-70ed-42e4-9217-a300fcd562d9\") " pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.304700 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.316882 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.329079 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.342689 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.352884 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.364304 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.374955 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.375000 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.375049 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.375071 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.375067 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.375087 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:28Z","lastTransitionTime":"2025-10-05T20:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.388789 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.401463 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.404545 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs\") pod \"network-metrics-daemon-ktspr\" (UID: \"f99b8ef3-70ed-42e4-9217-a300fcd562d9\") " pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.404620 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t7dct\" (UniqueName: \"kubernetes.io/projected/f99b8ef3-70ed-42e4-9217-a300fcd562d9-kube-api-access-t7dct\") pod \"network-metrics-daemon-ktspr\" (UID: \"f99b8ef3-70ed-42e4-9217-a300fcd562d9\") " pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:28 crc kubenswrapper[4753]: E1005 20:15:28.404932 4753 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:15:28 crc kubenswrapper[4753]: E1005 20:15:28.404988 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs podName:f99b8ef3-70ed-42e4-9217-a300fcd562d9 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:28.904973221 +0000 UTC m=+37.753301463 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs") pod "network-metrics-daemon-ktspr" (UID: "f99b8ef3-70ed-42e4-9217-a300fcd562d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.412452 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.422007 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc 
kubenswrapper[4753]: I1005 20:15:28.424449 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7dct\" (UniqueName: \"kubernetes.io/projected/f99b8ef3-70ed-42e4-9217-a300fcd562d9-kube-api-access-t7dct\") pod \"network-metrics-daemon-ktspr\" (UID: \"f99b8ef3-70ed-42e4-9217-a300fcd562d9\") " pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.441468 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.453460 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 
2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.464922 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.477724 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.477792 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.477810 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.477831 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.477867 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:28Z","lastTransitionTime":"2025-10-05T20:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.478524 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z 
is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.492096 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.503271 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.515779 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.535664 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:26Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:26.053978 6098 model_client.go:382] Update operations generated as: [{Op:update 
Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:15:26.054003 6098 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current ti\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.548354 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f41
02e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.561738 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:28Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.580436 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.580465 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.580476 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.580494 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.580506 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:28Z","lastTransitionTime":"2025-10-05T20:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.683399 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.683440 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.683472 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.683493 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.683507 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:28Z","lastTransitionTime":"2025-10-05T20:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.785494 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.785536 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.785552 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.785572 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.785586 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:28Z","lastTransitionTime":"2025-10-05T20:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.888179 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.888225 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.888240 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.888260 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.888274 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:28Z","lastTransitionTime":"2025-10-05T20:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.908588 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs\") pod \"network-metrics-daemon-ktspr\" (UID: \"f99b8ef3-70ed-42e4-9217-a300fcd562d9\") " pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:28 crc kubenswrapper[4753]: E1005 20:15:28.908784 4753 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:15:28 crc kubenswrapper[4753]: E1005 20:15:28.908891 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs podName:f99b8ef3-70ed-42e4-9217-a300fcd562d9 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:29.908850074 +0000 UTC m=+38.757178396 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs") pod "network-metrics-daemon-ktspr" (UID: "f99b8ef3-70ed-42e4-9217-a300fcd562d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.991756 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.991797 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.991809 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.991824 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:28 crc kubenswrapper[4753]: I1005 20:15:28.991835 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:28Z","lastTransitionTime":"2025-10-05T20:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.094885 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.094924 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.094934 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.094951 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.094964 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:29Z","lastTransitionTime":"2025-10-05T20:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.196944 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.196974 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.196982 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.196995 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.197004 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:29Z","lastTransitionTime":"2025-10-05T20:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.299409 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.299611 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.299619 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.299632 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.299642 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:29Z","lastTransitionTime":"2025-10-05T20:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.401490 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.401525 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.401535 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.401549 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.401558 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:29Z","lastTransitionTime":"2025-10-05T20:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.504195 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.504281 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.504298 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.504321 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.504337 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:29Z","lastTransitionTime":"2025-10-05T20:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.606085 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.606126 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.606161 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.606180 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.606192 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:29Z","lastTransitionTime":"2025-10-05T20:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.708305 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.708342 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.708352 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.708365 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.708375 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:29Z","lastTransitionTime":"2025-10-05T20:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.810917 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.810954 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.810963 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.810977 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.810987 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:29Z","lastTransitionTime":"2025-10-05T20:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.851561 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.851615 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.851564 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.851564 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:29 crc kubenswrapper[4753]: E1005 20:15:29.851729 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:29 crc kubenswrapper[4753]: E1005 20:15:29.851819 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:29 crc kubenswrapper[4753]: E1005 20:15:29.851972 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:29 crc kubenswrapper[4753]: E1005 20:15:29.852098 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.913200 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.913275 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.913288 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.913308 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.913342 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:29Z","lastTransitionTime":"2025-10-05T20:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:29 crc kubenswrapper[4753]: I1005 20:15:29.921578 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs\") pod \"network-metrics-daemon-ktspr\" (UID: \"f99b8ef3-70ed-42e4-9217-a300fcd562d9\") " pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:29 crc kubenswrapper[4753]: E1005 20:15:29.921779 4753 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:15:29 crc kubenswrapper[4753]: E1005 20:15:29.921863 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs podName:f99b8ef3-70ed-42e4-9217-a300fcd562d9 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:31.921842752 +0000 UTC m=+40.770171044 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs") pod "network-metrics-daemon-ktspr" (UID: "f99b8ef3-70ed-42e4-9217-a300fcd562d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.016396 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.016574 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.016605 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.016680 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.016707 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:30Z","lastTransitionTime":"2025-10-05T20:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.118960 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.119012 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.119021 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.119037 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.119046 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:30Z","lastTransitionTime":"2025-10-05T20:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.221552 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.221584 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.221592 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.221606 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.221615 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:30Z","lastTransitionTime":"2025-10-05T20:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.323799 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.323859 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.323870 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.323884 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.323903 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:30Z","lastTransitionTime":"2025-10-05T20:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.425796 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.425863 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.425879 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.425929 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.425945 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:30Z","lastTransitionTime":"2025-10-05T20:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.527708 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.527771 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.527784 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.527803 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.527837 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:30Z","lastTransitionTime":"2025-10-05T20:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.630470 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.630559 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.630576 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.630598 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.630671 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:30Z","lastTransitionTime":"2025-10-05T20:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.732813 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.732865 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.732875 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.732889 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.732897 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:30Z","lastTransitionTime":"2025-10-05T20:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.835598 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.835649 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.835661 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.835675 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.835685 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:30Z","lastTransitionTime":"2025-10-05T20:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.938228 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.938273 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.938285 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.938301 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:30 crc kubenswrapper[4753]: I1005 20:15:30.938310 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:30Z","lastTransitionTime":"2025-10-05T20:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.040767 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.041003 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.041016 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.041034 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.041049 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:31Z","lastTransitionTime":"2025-10-05T20:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.142982 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.143019 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.143030 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.143047 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.143061 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:31Z","lastTransitionTime":"2025-10-05T20:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.245407 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.245449 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.245460 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.245477 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.245489 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:31Z","lastTransitionTime":"2025-10-05T20:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.347630 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.347685 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.347695 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.347709 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.347718 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:31Z","lastTransitionTime":"2025-10-05T20:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.449699 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.449919 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.450038 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.450167 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.450271 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:31Z","lastTransitionTime":"2025-10-05T20:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.552445 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.552485 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.552496 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.552511 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.552521 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:31Z","lastTransitionTime":"2025-10-05T20:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.655388 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.655462 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.655474 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.655492 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.655502 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:31Z","lastTransitionTime":"2025-10-05T20:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.757990 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.758043 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.758056 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.758074 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.758087 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:31Z","lastTransitionTime":"2025-10-05T20:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.852039 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.852094 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.852094 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.852039 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:31 crc kubenswrapper[4753]: E1005 20:15:31.852306 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:31 crc kubenswrapper[4753]: E1005 20:15:31.852469 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:31 crc kubenswrapper[4753]: E1005 20:15:31.852559 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:31 crc kubenswrapper[4753]: E1005 20:15:31.852632 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.864374 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.864401 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.864409 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.864420 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.864428 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:31Z","lastTransitionTime":"2025-10-05T20:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.888404 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:31Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.919876 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:31Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.936547 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs\") pod \"network-metrics-daemon-ktspr\" 
(UID: \"f99b8ef3-70ed-42e4-9217-a300fcd562d9\") " pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:31 crc kubenswrapper[4753]: E1005 20:15:31.936716 4753 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:15:31 crc kubenswrapper[4753]: E1005 20:15:31.936821 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs podName:f99b8ef3-70ed-42e4-9217-a300fcd562d9 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:35.936792918 +0000 UTC m=+44.785121290 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs") pod "network-metrics-daemon-ktspr" (UID: "f99b8ef3-70ed-42e4-9217-a300fcd562d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.946071 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:31Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.966106 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.966342 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.966413 4753 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.966486 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.966557 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:31Z","lastTransitionTime":"2025-10-05T20:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.968354 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:26Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:26.053978 6098 model_client.go:382] Update operations generated as: 
[{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:15:26.054003 6098 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current ti\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:31Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.979114 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:31Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.989435 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:31Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:31 crc kubenswrapper[4753]: I1005 20:15:31.997649 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:31Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.007536 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.018940 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.029619 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.038877 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.051680 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:32 crc 
kubenswrapper[4753]: I1005 20:15:32.061964 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.068781 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.068811 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.068822 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.068837 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.068846 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:32Z","lastTransitionTime":"2025-10-05T20:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.074876 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:32Z 
is after 2025-08-24T17:21:41Z" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.090898 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.102691 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.170373 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.170398 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.170407 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.170420 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.170428 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:32Z","lastTransitionTime":"2025-10-05T20:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.272293 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.272328 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.272337 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.272350 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.272359 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:32Z","lastTransitionTime":"2025-10-05T20:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.374841 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.374939 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.374959 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.375020 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.375041 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:32Z","lastTransitionTime":"2025-10-05T20:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.479445 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.479515 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.479537 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.479571 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.479594 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:32Z","lastTransitionTime":"2025-10-05T20:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.582523 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.582589 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.582613 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.582644 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.582670 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:32Z","lastTransitionTime":"2025-10-05T20:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.685743 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.685822 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.685844 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.685872 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.685894 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:32Z","lastTransitionTime":"2025-10-05T20:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.789450 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.789532 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.789556 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.789585 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.789606 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:32Z","lastTransitionTime":"2025-10-05T20:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.852417 4753 scope.go:117] "RemoveContainer" containerID="b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.893303 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.893362 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.893383 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.893408 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.893427 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:32Z","lastTransitionTime":"2025-10-05T20:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.996203 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.996250 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.996269 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.996294 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:32 crc kubenswrapper[4753]: I1005 20:15:32.996312 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:32Z","lastTransitionTime":"2025-10-05T20:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.098702 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.098749 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.098763 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.098786 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.098803 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:33Z","lastTransitionTime":"2025-10-05T20:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.153364 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.155492 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780"} Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.155787 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.167312 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88
6bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Runn
ing\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.179216 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.195095 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.201286 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:33 crc 
kubenswrapper[4753]: I1005 20:15:33.201518 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.201623 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.201743 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.201838 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:33Z","lastTransitionTime":"2025-10-05T20:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.213432 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04
fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.229633 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.240252 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.270015 4753 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.288341 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:26Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:26.053978 6098 model_client.go:382] Update operations generated as: [{Op:update 
Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:15:26.054003 6098 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current ti\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.302089 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.303377 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.303402 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.303410 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 
20:15:33.303423 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.303434 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:33Z","lastTransitionTime":"2025-10-05T20:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.315470 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.325577 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.336366 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.346235 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.356006 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.365338 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.373770 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:33Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:33 crc 
kubenswrapper[4753]: I1005 20:15:33.407963 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.408024 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.408043 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.408064 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.408087 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:33Z","lastTransitionTime":"2025-10-05T20:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.510394 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.510429 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.510438 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.510451 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.510460 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:33Z","lastTransitionTime":"2025-10-05T20:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.612856 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.612895 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.612907 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.612922 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.612934 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:33Z","lastTransitionTime":"2025-10-05T20:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.715153 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.715197 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.715209 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.715225 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.715235 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:33Z","lastTransitionTime":"2025-10-05T20:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.817666 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.817693 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.817702 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.817715 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.817723 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:33Z","lastTransitionTime":"2025-10-05T20:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.851071 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:33 crc kubenswrapper[4753]: E1005 20:15:33.851221 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.851361 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.851605 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:33 crc kubenswrapper[4753]: E1005 20:15:33.851605 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.851638 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:33 crc kubenswrapper[4753]: E1005 20:15:33.851686 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:33 crc kubenswrapper[4753]: E1005 20:15:33.851747 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.919688 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.919729 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.919741 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.919757 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:33 crc kubenswrapper[4753]: I1005 20:15:33.919768 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:33Z","lastTransitionTime":"2025-10-05T20:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.022438 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.022471 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.022483 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.022499 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.022509 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:34Z","lastTransitionTime":"2025-10-05T20:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.124937 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.124977 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.124989 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.125006 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.125018 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:34Z","lastTransitionTime":"2025-10-05T20:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.227281 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.227308 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.227316 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.227329 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.227342 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:34Z","lastTransitionTime":"2025-10-05T20:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.330621 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.330660 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.330671 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.330689 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.330700 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:34Z","lastTransitionTime":"2025-10-05T20:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.433398 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.433445 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.433456 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.433473 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.433485 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:34Z","lastTransitionTime":"2025-10-05T20:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.536369 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.536402 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.536411 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.536424 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.536432 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:34Z","lastTransitionTime":"2025-10-05T20:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.638902 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.638954 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.638968 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.638990 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.639002 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:34Z","lastTransitionTime":"2025-10-05T20:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.741522 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.741558 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.741568 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.741583 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.741594 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:34Z","lastTransitionTime":"2025-10-05T20:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.844101 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.844149 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.844173 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.844193 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.844203 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:34Z","lastTransitionTime":"2025-10-05T20:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.946235 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.946277 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.946288 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.946303 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:34 crc kubenswrapper[4753]: I1005 20:15:34.946315 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:34Z","lastTransitionTime":"2025-10-05T20:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.047959 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.048004 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.048012 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.048026 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.048039 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:35Z","lastTransitionTime":"2025-10-05T20:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.150429 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.150457 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.150465 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.150477 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.150486 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:35Z","lastTransitionTime":"2025-10-05T20:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.252902 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.252935 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.252944 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.252957 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.252969 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:35Z","lastTransitionTime":"2025-10-05T20:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.355278 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.355314 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.355323 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.355338 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.355347 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:35Z","lastTransitionTime":"2025-10-05T20:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.457888 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.458130 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.458222 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.458312 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.458378 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:35Z","lastTransitionTime":"2025-10-05T20:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.560537 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.560580 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.560589 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.560602 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.560611 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:35Z","lastTransitionTime":"2025-10-05T20:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.662544 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.662571 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.662579 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.662592 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.662601 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:35Z","lastTransitionTime":"2025-10-05T20:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.764752 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.764795 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.764807 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.764824 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.764836 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:35Z","lastTransitionTime":"2025-10-05T20:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.851819 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:35 crc kubenswrapper[4753]: E1005 20:15:35.852367 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.851867 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:35 crc kubenswrapper[4753]: E1005 20:15:35.852744 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.851915 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:35 crc kubenswrapper[4753]: E1005 20:15:35.853093 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.851830 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:35 crc kubenswrapper[4753]: E1005 20:15:35.853459 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.867908 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.867961 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.867974 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.867993 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.868006 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:35Z","lastTransitionTime":"2025-10-05T20:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.970337 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.970397 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.970414 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.970440 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.970457 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:35Z","lastTransitionTime":"2025-10-05T20:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:35 crc kubenswrapper[4753]: I1005 20:15:35.972873 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs\") pod \"network-metrics-daemon-ktspr\" (UID: \"f99b8ef3-70ed-42e4-9217-a300fcd562d9\") " pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:35 crc kubenswrapper[4753]: E1005 20:15:35.973013 4753 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:15:35 crc kubenswrapper[4753]: E1005 20:15:35.973108 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs podName:f99b8ef3-70ed-42e4-9217-a300fcd562d9 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:43.973083708 +0000 UTC m=+52.821411980 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs") pod "network-metrics-daemon-ktspr" (UID: "f99b8ef3-70ed-42e4-9217-a300fcd562d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.073483 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.073522 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.073534 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.073553 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.073565 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:36Z","lastTransitionTime":"2025-10-05T20:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.175673 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.175899 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.175997 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.176092 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.176200 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:36Z","lastTransitionTime":"2025-10-05T20:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.278981 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.279022 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.279036 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.279054 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.279068 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:36Z","lastTransitionTime":"2025-10-05T20:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.381369 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.382072 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.382199 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.382328 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.382430 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:36Z","lastTransitionTime":"2025-10-05T20:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.485463 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.485529 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.485555 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.485585 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.485606 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:36Z","lastTransitionTime":"2025-10-05T20:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.588823 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.588880 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.588894 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.588915 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.588927 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:36Z","lastTransitionTime":"2025-10-05T20:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.692252 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.692596 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.692778 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.692965 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.693113 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:36Z","lastTransitionTime":"2025-10-05T20:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.796295 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.796350 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.796369 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.796394 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.796412 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:36Z","lastTransitionTime":"2025-10-05T20:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.899351 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.899638 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.899917 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.900100 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:36 crc kubenswrapper[4753]: I1005 20:15:36.900344 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:36Z","lastTransitionTime":"2025-10-05T20:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.003811 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.003875 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.003897 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.003931 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.003956 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:37Z","lastTransitionTime":"2025-10-05T20:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.107617 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.107682 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.107701 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.107729 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.107748 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:37Z","lastTransitionTime":"2025-10-05T20:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.211583 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.211926 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.212136 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.212403 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.212592 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:37Z","lastTransitionTime":"2025-10-05T20:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.315388 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.315481 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.315501 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.315524 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.315543 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:37Z","lastTransitionTime":"2025-10-05T20:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.418271 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.418303 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.418315 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.418331 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.418344 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:37Z","lastTransitionTime":"2025-10-05T20:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.520702 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.520734 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.520746 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.520763 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.520775 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:37Z","lastTransitionTime":"2025-10-05T20:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.623259 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.623578 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.623696 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.623784 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.624132 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:37Z","lastTransitionTime":"2025-10-05T20:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.726786 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.727070 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.727275 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.727497 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.727635 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:37Z","lastTransitionTime":"2025-10-05T20:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.830785 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.830880 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.830900 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.830949 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.830968 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:37Z","lastTransitionTime":"2025-10-05T20:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.852021 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:37 crc kubenswrapper[4753]: E1005 20:15:37.852146 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.852517 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:37 crc kubenswrapper[4753]: E1005 20:15:37.852607 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.852659 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:37 crc kubenswrapper[4753]: E1005 20:15:37.852716 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.852835 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:37 crc kubenswrapper[4753]: E1005 20:15:37.852895 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.934487 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.934550 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.934567 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.934591 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:37 crc kubenswrapper[4753]: I1005 20:15:37.934610 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:37Z","lastTransitionTime":"2025-10-05T20:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.037674 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.037746 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.037786 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.037818 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.037844 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:38Z","lastTransitionTime":"2025-10-05T20:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.141366 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.141906 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.142066 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.142247 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.142399 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:38Z","lastTransitionTime":"2025-10-05T20:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.245402 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.245467 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.245490 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.245525 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.245546 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:38Z","lastTransitionTime":"2025-10-05T20:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.347755 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.348204 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.348341 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.348482 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.348619 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:38Z","lastTransitionTime":"2025-10-05T20:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.452292 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.452428 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.452489 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.452518 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.452587 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:38Z","lastTransitionTime":"2025-10-05T20:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.496903 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.496984 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.497009 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.497040 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.497065 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:38Z","lastTransitionTime":"2025-10-05T20:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:38 crc kubenswrapper[4753]: E1005 20:15:38.522974 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:38Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.528704 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.528967 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.529184 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.529380 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.529569 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:38Z","lastTransitionTime":"2025-10-05T20:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:38 crc kubenswrapper[4753]: E1005 20:15:38.552571 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:38Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.558285 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.558340 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.558357 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.558378 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.558396 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:38Z","lastTransitionTime":"2025-10-05T20:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:38 crc kubenswrapper[4753]: E1005 20:15:38.579648 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:38Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.585125 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.585459 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.585679 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.585890 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.586085 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:38Z","lastTransitionTime":"2025-10-05T20:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:38 crc kubenswrapper[4753]: E1005 20:15:38.608916 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:38Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.613848 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.613905 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.613927 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.613957 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.613977 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:38Z","lastTransitionTime":"2025-10-05T20:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:38 crc kubenswrapper[4753]: E1005 20:15:38.634502 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:38Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:38 crc kubenswrapper[4753]: E1005 20:15:38.634762 4753 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.636877 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.636926 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.636941 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.636963 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.636995 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:38Z","lastTransitionTime":"2025-10-05T20:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.740336 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.740392 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.740410 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.740433 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.740449 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:38Z","lastTransitionTime":"2025-10-05T20:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.843015 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.843537 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.843749 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.843965 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.844226 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:38Z","lastTransitionTime":"2025-10-05T20:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.947812 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.947853 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.947863 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.947880 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:38 crc kubenswrapper[4753]: I1005 20:15:38.947889 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:38Z","lastTransitionTime":"2025-10-05T20:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.051997 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.052045 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.052057 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.052075 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.052085 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:39Z","lastTransitionTime":"2025-10-05T20:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.155084 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.155173 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.155195 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.155220 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.155307 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:39Z","lastTransitionTime":"2025-10-05T20:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.260577 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.261009 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.261256 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.261480 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.261677 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:39Z","lastTransitionTime":"2025-10-05T20:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.364320 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.364398 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.364423 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.364453 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.364479 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:39Z","lastTransitionTime":"2025-10-05T20:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.467771 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.467828 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.467848 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.467873 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.467951 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:39Z","lastTransitionTime":"2025-10-05T20:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.571213 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.571279 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.571291 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.571313 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.571328 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:39Z","lastTransitionTime":"2025-10-05T20:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.675097 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.675337 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.675479 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.675563 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.675670 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:39Z","lastTransitionTime":"2025-10-05T20:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.782477 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.782549 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.782561 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.782584 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.782642 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:39Z","lastTransitionTime":"2025-10-05T20:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.853989 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.854065 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.854024 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.854176 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:39 crc kubenswrapper[4753]: E1005 20:15:39.854366 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:39 crc kubenswrapper[4753]: E1005 20:15:39.854609 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:39 crc kubenswrapper[4753]: E1005 20:15:39.854757 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:39 crc kubenswrapper[4753]: E1005 20:15:39.854846 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.886443 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.886525 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.886539 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.886584 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.886597 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:39Z","lastTransitionTime":"2025-10-05T20:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.990282 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.990384 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.990398 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.990443 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:39 crc kubenswrapper[4753]: I1005 20:15:39.990457 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:39Z","lastTransitionTime":"2025-10-05T20:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.094010 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.094069 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.094090 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.094115 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.094134 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:40Z","lastTransitionTime":"2025-10-05T20:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.197353 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.197746 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.197909 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.198109 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.198303 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:40Z","lastTransitionTime":"2025-10-05T20:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.301113 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.301200 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.301214 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.301236 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.301250 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:40Z","lastTransitionTime":"2025-10-05T20:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.403871 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.403929 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.403942 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.403963 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.403975 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:40Z","lastTransitionTime":"2025-10-05T20:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.506886 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.506951 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.506974 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.507003 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.507024 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:40Z","lastTransitionTime":"2025-10-05T20:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.610284 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.610365 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.610400 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.610440 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.610458 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:40Z","lastTransitionTime":"2025-10-05T20:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.714548 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.714665 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.714684 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.714709 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.714727 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:40Z","lastTransitionTime":"2025-10-05T20:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.817859 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.817909 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.817919 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.817936 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.817948 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:40Z","lastTransitionTime":"2025-10-05T20:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.921745 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.921818 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.921837 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.921864 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:40 crc kubenswrapper[4753]: I1005 20:15:40.921883 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:40Z","lastTransitionTime":"2025-10-05T20:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.024764 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.024810 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.024824 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.024843 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.024854 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:41Z","lastTransitionTime":"2025-10-05T20:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.127491 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.127580 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.127610 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.127642 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.127667 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:41Z","lastTransitionTime":"2025-10-05T20:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.230868 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.230914 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.230928 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.230946 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.230957 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:41Z","lastTransitionTime":"2025-10-05T20:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.333438 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.333503 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.333521 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.333544 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.333561 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:41Z","lastTransitionTime":"2025-10-05T20:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.436522 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.436621 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.436653 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.436684 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.436711 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:41Z","lastTransitionTime":"2025-10-05T20:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.538389 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.538425 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.538440 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.538461 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.538479 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:41Z","lastTransitionTime":"2025-10-05T20:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.641919 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.641982 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.642001 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.642026 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.642045 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:41Z","lastTransitionTime":"2025-10-05T20:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.748415 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.748452 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.748464 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.748480 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.748492 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:41Z","lastTransitionTime":"2025-10-05T20:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.851334 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:41 crc kubenswrapper[4753]: E1005 20:15:41.851585 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.851636 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:41 crc kubenswrapper[4753]: E1005 20:15:41.851826 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.851952 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:41 crc kubenswrapper[4753]: E1005 20:15:41.852066 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.852269 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:41 crc kubenswrapper[4753]: E1005 20:15:41.852408 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.852688 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.852733 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.852809 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.852938 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.852964 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:41Z","lastTransitionTime":"2025-10-05T20:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.864178 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b6268b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:41Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.877255 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:41Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.892919 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:41Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:41 crc 
kubenswrapper[4753]: I1005 20:15:41.910909 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:41Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.927585 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:41Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.943286 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:15:41Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.958844 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.958899 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.958921 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.958948 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.958994 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:41Z","lastTransitionTime":"2025-10-05T20:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.959033 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:41Z 
is after 2025-08-24T17:21:41Z" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.978964 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:41Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:41 crc kubenswrapper[4753]: I1005 20:15:41.996167 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:41Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.013909 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.042168 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:26Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:26.053978 6098 model_client.go:382] Update operations generated as: [{Op:update 
Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:15:26.054003 6098 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current ti\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.055920 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.063768 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.063792 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.063801 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.063813 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.063823 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:42Z","lastTransitionTime":"2025-10-05T20:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.069230 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba2930559
47cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.087831 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.099742 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.116375 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.166165 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.166192 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.166200 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.166213 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.166222 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:42Z","lastTransitionTime":"2025-10-05T20:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.219988 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.237957 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.242297 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.262940 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20
641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.268839 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.268884 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.268901 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.268924 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.268940 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:42Z","lastTransitionTime":"2025-10-05T20:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.311923 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.326838 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.344804 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.370054 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:26Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:26.053978 6098 model_client.go:382] Update operations generated as: [{Op:update 
Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:15:26.054003 6098 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current ti\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.371887 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.371960 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.371979 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.372003 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.372019 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:42Z","lastTransitionTime":"2025-10-05T20:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.389769 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.409108 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.431737 4753 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.447287 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.468123 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.475272 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.475352 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.475373 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.475407 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.475433 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:42Z","lastTransitionTime":"2025-10-05T20:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.485263 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b6268b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.497578 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.511399 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc 
kubenswrapper[4753]: I1005 20:15:42.531652 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.548993 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.580031 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.580099 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:42 crc kubenswrapper[4753]: 
I1005 20:15:42.580111 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.580154 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.580169 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:42Z","lastTransitionTime":"2025-10-05T20:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.685007 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.685073 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.685090 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.685112 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.685125 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:42Z","lastTransitionTime":"2025-10-05T20:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.788352 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.788382 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.788390 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.788404 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.788411 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:42Z","lastTransitionTime":"2025-10-05T20:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.853122 4753 scope.go:117] "RemoveContainer" containerID="7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.895534 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.896069 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.896096 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.896128 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.896190 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:42Z","lastTransitionTime":"2025-10-05T20:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.999490 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.999525 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.999536 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:42 crc kubenswrapper[4753]: I1005 20:15:42.999553 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:42.999564 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:42Z","lastTransitionTime":"2025-10-05T20:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.101746 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.101795 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.101805 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.101819 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.101828 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:43Z","lastTransitionTime":"2025-10-05T20:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.192607 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/1.log" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.195030 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerStarted","Data":"76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f"} Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.195504 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.206448 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.206486 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.206497 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.206516 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.206527 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:43Z","lastTransitionTime":"2025-10-05T20:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.212174 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.237564 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.266382 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plug
ins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2d
aed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"starte
dAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.294387 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d
8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.308821 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.309070 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.309088 4753 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.309106 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.309117 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:43Z","lastTransitionTime":"2025-10-05T20:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.314492 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.336477 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.356468 4753 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.382352 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:26Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:26.053978 6098 model_client.go:382] Update operations generated as: [{Op:update 
Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:15:26.054003 6098 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
ti\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.397113 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7b2637f-73a4-4dbb-b1f0-2ed149e34157\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf427d5366f0edf753e57bb958b0e30b93716e0d46779a5c6fcca2db6794c868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.411094 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.411197 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.411212 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.411240 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.411256 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:43Z","lastTransitionTime":"2025-10-05T20:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.419615 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.439901 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.451648 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.467464 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.482424 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.495291 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.507413 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.513963 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.513992 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.514000 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.514015 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.514024 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:43Z","lastTransitionTime":"2025-10-05T20:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.520755 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:43Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:43 crc 
kubenswrapper[4753]: I1005 20:15:43.569067 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.569214 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.569444 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.569541 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:16:15.56952086 +0000 UTC m=+84.417849092 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.570357 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:16:15.570334666 +0000 UTC m=+84.418662918 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.616902 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.616985 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.617003 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.617028 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.617042 4753 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:43Z","lastTransitionTime":"2025-10-05T20:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.669803 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.669843 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.669863 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.669937 4753 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.669982 4753 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:16:15.669969959 +0000 UTC m=+84.518298191 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.670171 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.670184 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.670195 4753 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.670218 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-05 20:16:15.670211997 +0000 UTC m=+84.518540229 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.670381 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.670397 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.670404 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.670426 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-05 20:16:15.670420413 +0000 UTC m=+84.518748645 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.719963 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.720223 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.720303 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.720379 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.720455 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:43Z","lastTransitionTime":"2025-10-05T20:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.823839 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.824068 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.824175 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.824255 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.824335 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:43Z","lastTransitionTime":"2025-10-05T20:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.851910 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.852318 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.852389 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.852438 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.852293 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.852632 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.852805 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.852941 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.928111 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.928244 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.928264 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.928288 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.928304 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:43Z","lastTransitionTime":"2025-10-05T20:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:43 crc kubenswrapper[4753]: I1005 20:15:43.973682 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs\") pod \"network-metrics-daemon-ktspr\" (UID: \"f99b8ef3-70ed-42e4-9217-a300fcd562d9\") " pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.973916 4753 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:15:43 crc kubenswrapper[4753]: E1005 20:15:43.974056 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs podName:f99b8ef3-70ed-42e4-9217-a300fcd562d9 nodeName:}" failed. No retries permitted until 2025-10-05 20:15:59.974026399 +0000 UTC m=+68.822354631 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs") pod "network-metrics-daemon-ktspr" (UID: "f99b8ef3-70ed-42e4-9217-a300fcd562d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.030932 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.030978 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.030991 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.031008 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.031019 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:44Z","lastTransitionTime":"2025-10-05T20:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.134383 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.134461 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.134483 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.134513 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.134534 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:44Z","lastTransitionTime":"2025-10-05T20:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.200450 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/2.log" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.201303 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/1.log" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.204167 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerID="76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f" exitCode=1 Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.204204 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerDied","Data":"76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f"} Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.204239 4753 scope.go:117] "RemoveContainer" containerID="7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.204793 4753 scope.go:117] "RemoveContainer" containerID="76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f" Oct 05 20:15:44 crc kubenswrapper[4753]: E1005 20:15:44.204938 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.219548 4753 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.232018 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.240322 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.240353 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.240362 4753 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.240383 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.240391 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:44Z","lastTransitionTime":"2025-10-05T20:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.244591 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b6268b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.255574 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.273815 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc 
kubenswrapper[4753]: I1005 20:15:44.286162 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.298219 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20
:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.316178 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\"
:\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.327588 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcb
cf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\
\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.340474 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.342945 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.342993 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.343005 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.343025 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.343037 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:44Z","lastTransitionTime":"2025-10-05T20:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.351871 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba2930559
47cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.363472 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.381921 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:26Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:26.053978 6098 model_client.go:382] Update operations generated as: [{Op:update 
Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:15:26.054003 6098 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current ti\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:43Z\\\",\\\"message\\\":\\\"t:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825939 6325 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825356 6325 port_cache.go:96] port-cache(openshift-network-diagnostics_network-check-target-xd92c): added port \\\\u0026{name:openshift-network-diagnostics_network-check-target-xd92c uuid:61897e97-c771-4738-8709-09636387cb00 logicalSwitch:crc ips:[0xc0094d9bc0] mac:[10 88 10 217 0 4] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.4/23] and MAC: 0a:58:0a:d9:00:04\\\\nI1005 20:15:43.826133 6325 pods.go:252] 
[openshift-network-diagnostics/network-check-target-xd\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278ca
a7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.392657 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7b2637f-73a4-4dbb-b1f0-2ed149e34157\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf427d5366f0edf753e57bb958b0e30b93716e0d46779a5c6fcca2db6794c868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.407814 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.421189 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.441316 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.446162 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.446213 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.446245 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.446263 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.446275 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:44Z","lastTransitionTime":"2025-10-05T20:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.548430 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.548701 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.548783 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.548853 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.548909 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:44Z","lastTransitionTime":"2025-10-05T20:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.650953 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.651006 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.651015 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.651031 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.651055 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:44Z","lastTransitionTime":"2025-10-05T20:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.754852 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.754899 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.754915 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.754935 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.754946 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:44Z","lastTransitionTime":"2025-10-05T20:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.857873 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.857915 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.857942 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.857955 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.857965 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:44Z","lastTransitionTime":"2025-10-05T20:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.892488 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.907665 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc2
76e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.919714 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.930033 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.956242 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1ba77f38c8f855a849f59af8b30bc16a64eecf9d55389a03b46ce3e05e679c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:26Z\\\",\\\"message\\\":\\\"] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:26.053978 6098 model_client.go:382] Update operations generated as: [{Op:update 
Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:15:26.054003 6098 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current ti\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:43Z\\\",\\\"message\\\":\\\"t:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825939 6325 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825356 6325 port_cache.go:96] port-cache(openshift-network-diagnostics_network-check-target-xd92c): added port \\\\u0026{name:openshift-network-diagnostics_network-check-target-xd92c uuid:61897e97-c771-4738-8709-09636387cb00 logicalSwitch:crc ips:[0xc0094d9bc0] mac:[10 88 10 217 0 4] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.4/23] and MAC: 0a:58:0a:d9:00:04\\\\nI1005 20:15:43.826133 6325 pods.go:252] 
[openshift-network-diagnostics/network-check-target-xd\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278ca
a7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.960308 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.960350 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.960362 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.960379 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.960390 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:44Z","lastTransitionTime":"2025-10-05T20:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.970543 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7b2637f-73a4-4dbb-b1f0-2ed149e34157\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf427d5366f0edf753e57bb958b0e30b93716e0d46779a5c6fcca2db6794c868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.981716 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:44 crc kubenswrapper[4753]: I1005 20:15:44.992929 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:44Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.002392 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.016564 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.032779 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.048548 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.062931 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.062985 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.063000 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:45 crc 
kubenswrapper[4753]: I1005 20:15:45.063015 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.063027 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:45Z","lastTransitionTime":"2025-10-05T20:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.063027 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.077035 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc 
kubenswrapper[4753]: I1005 20:15:45.093513 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.108465 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20
:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.122586 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\"
:\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.135242 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcb
cf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\
\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.166378 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.166416 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.166425 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.166440 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.166451 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:45Z","lastTransitionTime":"2025-10-05T20:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.215797 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/2.log" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.220397 4753 scope.go:117] "RemoveContainer" containerID="76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f" Oct 05 20:15:45 crc kubenswrapper[4753]: E1005 20:15:45.220652 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.234779 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.250068 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7b2637f-73a4-4dbb-b1f0-2ed149e34157\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf427d5366f0edf753e57bb958b0e30b93716e0d46779a5c6fcca2db6794c868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.262428 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.269843 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.269956 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.270057 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.270132 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.270176 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:45Z","lastTransitionTime":"2025-10-05T20:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.275330 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.287608 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.301742 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc 
kubenswrapper[4753]: I1005 20:15:45.320442 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.337838 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.354880 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.373285 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.373359 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.373372 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:45 crc 
kubenswrapper[4753]: I1005 20:15:45.373423 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.373438 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:45Z","lastTransitionTime":"2025-10-05T20:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.375414 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.393386 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f
2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce
3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.414895 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d
8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.433783 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.463253 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:43Z\\\",\\\"message\\\":\\\"t:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825939 6325 transact.go:42] Configuring OVN: 
[{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825356 6325 port_cache.go:96] port-cache(openshift-network-diagnostics_network-check-target-xd92c): added port \\\\u0026{name:openshift-network-diagnostics_network-check-target-xd92c uuid:61897e97-c771-4738-8709-09636387cb00 logicalSwitch:crc ips:[0xc0094d9bc0] mac:[10 88 10 217 0 4] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.4/23] and MAC: 0a:58:0a:d9:00:04\\\\nI1005 20:15:43.826133 6325 pods.go:252] [openshift-network-diagnostics/network-check-target-xd\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.475378 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.475509 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.475574 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.475665 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.475725 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:45Z","lastTransitionTime":"2025-10-05T20:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.483947 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e
c25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.499399 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.515181 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:45Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.578709 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.578774 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.578792 4753 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.578819 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.578836 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:45Z","lastTransitionTime":"2025-10-05T20:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.681996 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.682059 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.682077 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.682102 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.682128 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:45Z","lastTransitionTime":"2025-10-05T20:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.785101 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.785167 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.785176 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.785195 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.785205 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:45Z","lastTransitionTime":"2025-10-05T20:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.852213 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.852257 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.852278 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:45 crc kubenswrapper[4753]: E1005 20:15:45.852360 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:45 crc kubenswrapper[4753]: E1005 20:15:45.852503 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:45 crc kubenswrapper[4753]: E1005 20:15:45.852591 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.852242 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:45 crc kubenswrapper[4753]: E1005 20:15:45.853110 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.888622 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.888958 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.889075 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.889256 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.889370 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:45Z","lastTransitionTime":"2025-10-05T20:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.992019 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.992444 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.992599 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.992831 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:45 crc kubenswrapper[4753]: I1005 20:15:45.992964 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:45Z","lastTransitionTime":"2025-10-05T20:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.095866 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.095928 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.095946 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.095972 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.095993 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:46Z","lastTransitionTime":"2025-10-05T20:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.199284 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.199343 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.199363 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.199387 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.199405 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:46Z","lastTransitionTime":"2025-10-05T20:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.302173 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.302209 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.302217 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.302232 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.302240 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:46Z","lastTransitionTime":"2025-10-05T20:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.405081 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.405391 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.405647 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.406017 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.406111 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:46Z","lastTransitionTime":"2025-10-05T20:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.513576 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.513642 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.513655 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.513681 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.513695 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:46Z","lastTransitionTime":"2025-10-05T20:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.616825 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.616867 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.616879 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.616897 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.616907 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:46Z","lastTransitionTime":"2025-10-05T20:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.720475 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.720533 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.720546 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.720567 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.720582 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:46Z","lastTransitionTime":"2025-10-05T20:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.824237 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.824310 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.824335 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.824367 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.824408 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:46Z","lastTransitionTime":"2025-10-05T20:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.927826 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.927905 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.927924 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.927948 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:46 crc kubenswrapper[4753]: I1005 20:15:46.927968 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:46Z","lastTransitionTime":"2025-10-05T20:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.031040 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.031108 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.031265 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.031294 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.031314 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:47Z","lastTransitionTime":"2025-10-05T20:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.134600 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.134676 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.134698 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.134721 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.134738 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:47Z","lastTransitionTime":"2025-10-05T20:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.237985 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.238056 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.238082 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.238115 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.238135 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:47Z","lastTransitionTime":"2025-10-05T20:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.341229 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.341297 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.341316 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.341340 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.341364 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:47Z","lastTransitionTime":"2025-10-05T20:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.444635 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.444706 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.444715 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.444738 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.444747 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:47Z","lastTransitionTime":"2025-10-05T20:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.548060 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.548132 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.548176 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.548207 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.548226 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:47Z","lastTransitionTime":"2025-10-05T20:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.651409 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.651524 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.651549 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.651752 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.651795 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:47Z","lastTransitionTime":"2025-10-05T20:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.755331 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.755392 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.755410 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.755436 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.755456 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:47Z","lastTransitionTime":"2025-10-05T20:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.851927 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.852051 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.852052 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:47 crc kubenswrapper[4753]: E1005 20:15:47.852130 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:47 crc kubenswrapper[4753]: E1005 20:15:47.852334 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:47 crc kubenswrapper[4753]: E1005 20:15:47.852960 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.853057 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:47 crc kubenswrapper[4753]: E1005 20:15:47.853276 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.862405 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.862698 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.862725 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.862761 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.862788 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:47Z","lastTransitionTime":"2025-10-05T20:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.966681 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.966757 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.966784 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.966819 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:47 crc kubenswrapper[4753]: I1005 20:15:47.966846 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:47Z","lastTransitionTime":"2025-10-05T20:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.070633 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.070714 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.070742 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.070780 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.070804 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:48Z","lastTransitionTime":"2025-10-05T20:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.175080 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.175214 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.175245 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.175287 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.175316 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:48Z","lastTransitionTime":"2025-10-05T20:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.279284 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.279345 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.279365 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.279392 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.279419 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:48Z","lastTransitionTime":"2025-10-05T20:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.383702 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.383775 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.383797 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.383829 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.383848 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:48Z","lastTransitionTime":"2025-10-05T20:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.486998 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.487047 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.487056 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.487081 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.487090 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:48Z","lastTransitionTime":"2025-10-05T20:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.589751 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.589849 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.589884 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.589923 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.589951 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:48Z","lastTransitionTime":"2025-10-05T20:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.693674 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.693739 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.693763 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.693796 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.693817 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:48Z","lastTransitionTime":"2025-10-05T20:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.727053 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.727106 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.727127 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.727186 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.727211 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:48Z","lastTransitionTime":"2025-10-05T20:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:48 crc kubenswrapper[4753]: E1005 20:15:48.750254 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:48Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.762717 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.764363 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.764401 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.764437 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.764461 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:48Z","lastTransitionTime":"2025-10-05T20:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:48 crc kubenswrapper[4753]: E1005 20:15:48.785993 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{…}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:48Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.791957 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.792011 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.792038 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.792071 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.792096 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:48Z","lastTransitionTime":"2025-10-05T20:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:48 crc kubenswrapper[4753]: E1005 20:15:48.815676 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:48Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.822108 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.822270 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.822305 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.822339 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.822364 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:48Z","lastTransitionTime":"2025-10-05T20:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:48 crc kubenswrapper[4753]: E1005 20:15:48.850053 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:48Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.856375 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.856460 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.856486 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.856522 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.856548 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:48Z","lastTransitionTime":"2025-10-05T20:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:48 crc kubenswrapper[4753]: E1005 20:15:48.880566 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:48Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:48 crc kubenswrapper[4753]: E1005 20:15:48.880886 4753 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.883726 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.883766 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.883779 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.883800 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.883816 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:48Z","lastTransitionTime":"2025-10-05T20:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.987333 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.987852 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.988043 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.988294 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:48 crc kubenswrapper[4753]: I1005 20:15:48.988561 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:48Z","lastTransitionTime":"2025-10-05T20:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.092649 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.092720 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.092739 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.092770 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.092795 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:49Z","lastTransitionTime":"2025-10-05T20:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.196708 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.196802 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.196830 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.196869 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.196899 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:49Z","lastTransitionTime":"2025-10-05T20:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.300246 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.300337 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.300361 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.300397 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.300422 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:49Z","lastTransitionTime":"2025-10-05T20:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.403913 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.403977 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.403995 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.404021 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.404041 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:49Z","lastTransitionTime":"2025-10-05T20:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.507570 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.507616 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.507627 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.507644 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.507655 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:49Z","lastTransitionTime":"2025-10-05T20:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.609869 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.609916 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.609931 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.609945 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.609955 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:49Z","lastTransitionTime":"2025-10-05T20:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.713659 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.714212 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.714355 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.714497 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.714654 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:49Z","lastTransitionTime":"2025-10-05T20:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.820915 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.820962 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.820980 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.821002 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.821024 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:49Z","lastTransitionTime":"2025-10-05T20:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.852015 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:49 crc kubenswrapper[4753]: E1005 20:15:49.852595 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.852950 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:49 crc kubenswrapper[4753]: E1005 20:15:49.853128 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.853382 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:49 crc kubenswrapper[4753]: E1005 20:15:49.853563 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.853452 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:49 crc kubenswrapper[4753]: E1005 20:15:49.853834 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.924270 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.924381 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.924407 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.924445 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:49 crc kubenswrapper[4753]: I1005 20:15:49.924476 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:49Z","lastTransitionTime":"2025-10-05T20:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.028121 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.028224 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.028248 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.028277 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.028299 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:50Z","lastTransitionTime":"2025-10-05T20:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.132648 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.132718 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.132737 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.132765 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.132785 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:50Z","lastTransitionTime":"2025-10-05T20:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.236896 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.236969 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.236993 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.237030 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.237053 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:50Z","lastTransitionTime":"2025-10-05T20:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.340222 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.340334 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.340374 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.340411 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.340438 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:50Z","lastTransitionTime":"2025-10-05T20:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.443959 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.444034 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.444055 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.444083 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.444102 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:50Z","lastTransitionTime":"2025-10-05T20:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.554360 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.554453 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.554482 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.554511 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.554532 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:50Z","lastTransitionTime":"2025-10-05T20:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.659569 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.659632 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.659647 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.659669 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.659682 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:50Z","lastTransitionTime":"2025-10-05T20:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.763040 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.763101 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.763120 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.763177 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.763197 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:50Z","lastTransitionTime":"2025-10-05T20:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.865323 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.865423 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.865444 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.865471 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.865488 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:50Z","lastTransitionTime":"2025-10-05T20:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.968996 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.969059 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.969080 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.969106 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:50 crc kubenswrapper[4753]: I1005 20:15:50.969123 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:50Z","lastTransitionTime":"2025-10-05T20:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.072568 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.072674 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.072734 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.072763 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.072816 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:51Z","lastTransitionTime":"2025-10-05T20:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.176058 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.176124 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.176163 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.176187 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.176208 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:51Z","lastTransitionTime":"2025-10-05T20:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.280802 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.280885 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.280905 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.280935 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.280953 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:51Z","lastTransitionTime":"2025-10-05T20:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.385189 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.385256 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.385273 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.385302 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.385320 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:51Z","lastTransitionTime":"2025-10-05T20:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.488916 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.488985 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.489006 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.489036 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.489055 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:51Z","lastTransitionTime":"2025-10-05T20:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.592573 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.592653 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.592674 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.592702 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.592720 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:51Z","lastTransitionTime":"2025-10-05T20:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.695899 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.696446 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.696673 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.696864 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.697065 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:51Z","lastTransitionTime":"2025-10-05T20:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.800677 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.800740 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.800759 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.800784 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.800802 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:51Z","lastTransitionTime":"2025-10-05T20:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.852039 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.852205 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.852238 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.852252 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:51 crc kubenswrapper[4753]: E1005 20:15:51.853203 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:51 crc kubenswrapper[4753]: E1005 20:15:51.853382 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:51 crc kubenswrapper[4753]: E1005 20:15:51.853610 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:51 crc kubenswrapper[4753]: E1005 20:15:51.853035 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.874684 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:51Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.892906 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:51Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.908930 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.909042 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.909067 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.909122 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.909311 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:51Z","lastTransitionTime":"2025-10-05T20:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.921063 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7b2637f-73a4-4dbb-b1f0-2ed149e34157\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf427d5366f0edf753e57bb958b0e3
0b93716e0d46779a5c6fcca2db6794c868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:51Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.943016 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:51Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.960519 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:51Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.975941 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:51Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:51 crc kubenswrapper[4753]: I1005 20:15:51.989555 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:51Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:52 crc 
kubenswrapper[4753]: I1005 20:15:52.008052 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:52Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.013413 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.013474 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.013532 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.013564 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.013589 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:52Z","lastTransitionTime":"2025-10-05T20:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.030008 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:52Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.050519 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:15:52Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.070422 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:15:52Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.094638 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:52Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.112131 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:52Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.117038 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.117080 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.117091 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.117113 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.117126 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:52Z","lastTransitionTime":"2025-10-05T20:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.128602 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:52Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.156531 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:43Z\\\",\\\"message\\\":\\\"t:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825939 6325 transact.go:42] Configuring OVN: 
[{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825356 6325 port_cache.go:96] port-cache(openshift-network-diagnostics_network-check-target-xd92c): added port \\\\u0026{name:openshift-network-diagnostics_network-check-target-xd92c uuid:61897e97-c771-4738-8709-09636387cb00 logicalSwitch:crc ips:[0xc0094d9bc0] mac:[10 88 10 217 0 4] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.4/23] and MAC: 0a:58:0a:d9:00:04\\\\nI1005 20:15:43.826133 6325 pods.go:252] [openshift-network-diagnostics/network-check-target-xd\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:52Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.173507 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569
badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:52Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.188385 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:52Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.220054 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.220126 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.220174 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.220194 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.220210 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:52Z","lastTransitionTime":"2025-10-05T20:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.322563 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.322594 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.322602 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.322615 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.322625 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:52Z","lastTransitionTime":"2025-10-05T20:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.424692 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.424724 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.424736 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.424751 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.424762 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:52Z","lastTransitionTime":"2025-10-05T20:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.526444 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.526471 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.526516 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.526532 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.526541 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:52Z","lastTransitionTime":"2025-10-05T20:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.629641 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.629672 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.629682 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.629718 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.629732 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:52Z","lastTransitionTime":"2025-10-05T20:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.732226 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.732274 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.732293 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.732318 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.732335 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:52Z","lastTransitionTime":"2025-10-05T20:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.835353 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.835414 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.835435 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.835463 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.835483 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:52Z","lastTransitionTime":"2025-10-05T20:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.937913 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.937961 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.937978 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.938004 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:52 crc kubenswrapper[4753]: I1005 20:15:52.938020 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:52Z","lastTransitionTime":"2025-10-05T20:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.041852 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.041883 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.041892 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.041904 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.041913 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:53Z","lastTransitionTime":"2025-10-05T20:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.145010 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.145066 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.145085 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.145128 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.145186 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:53Z","lastTransitionTime":"2025-10-05T20:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.248020 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.248058 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.248069 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.248082 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.248092 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:53Z","lastTransitionTime":"2025-10-05T20:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.351546 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.351626 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.351647 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.351677 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.351695 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:53Z","lastTransitionTime":"2025-10-05T20:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.454440 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.454478 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.454486 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.454500 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.454535 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:53Z","lastTransitionTime":"2025-10-05T20:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.557280 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.557345 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.557366 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.557390 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.557408 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:53Z","lastTransitionTime":"2025-10-05T20:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.660976 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.661048 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.661061 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.661081 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.661095 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:53Z","lastTransitionTime":"2025-10-05T20:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.764950 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.765016 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.765036 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.765063 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.765083 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:53Z","lastTransitionTime":"2025-10-05T20:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.851563 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.851654 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.851576 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:53 crc kubenswrapper[4753]: E1005 20:15:53.851926 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:53 crc kubenswrapper[4753]: E1005 20:15:53.851691 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.851677 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:53 crc kubenswrapper[4753]: E1005 20:15:53.852004 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:53 crc kubenswrapper[4753]: E1005 20:15:53.852123 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.868604 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.868673 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.868692 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.868717 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.868735 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:53Z","lastTransitionTime":"2025-10-05T20:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.972794 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.972870 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.972892 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.972916 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:53 crc kubenswrapper[4753]: I1005 20:15:53.972936 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:53Z","lastTransitionTime":"2025-10-05T20:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.075508 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.075560 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.075577 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.075597 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.075614 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:54Z","lastTransitionTime":"2025-10-05T20:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.179887 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.179938 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.179950 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.179969 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.179998 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:54Z","lastTransitionTime":"2025-10-05T20:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.282617 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.282660 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.282668 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.282681 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.282689 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:54Z","lastTransitionTime":"2025-10-05T20:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.385585 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.385639 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.385655 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.385672 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.385684 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:54Z","lastTransitionTime":"2025-10-05T20:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.488529 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.488669 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.488697 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.488721 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.488738 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:54Z","lastTransitionTime":"2025-10-05T20:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.591774 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.591808 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.591817 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.591830 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.591839 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:54Z","lastTransitionTime":"2025-10-05T20:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.695484 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.695538 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.695551 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.695573 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.695584 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:54Z","lastTransitionTime":"2025-10-05T20:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.798674 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.798706 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.798713 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.798728 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.798738 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:54Z","lastTransitionTime":"2025-10-05T20:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.902452 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.902649 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.902681 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.902770 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:54 crc kubenswrapper[4753]: I1005 20:15:54.902841 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:54Z","lastTransitionTime":"2025-10-05T20:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.007265 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.007341 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.007390 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.007422 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.007442 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:55Z","lastTransitionTime":"2025-10-05T20:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.110319 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.110369 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.110399 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.110425 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.110442 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:55Z","lastTransitionTime":"2025-10-05T20:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.213934 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.213999 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.214024 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.214050 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.214071 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:55Z","lastTransitionTime":"2025-10-05T20:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.317082 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.317168 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.317189 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.317214 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.317232 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:55Z","lastTransitionTime":"2025-10-05T20:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.421492 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.421557 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.421621 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.421654 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.421675 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:55Z","lastTransitionTime":"2025-10-05T20:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.525700 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.525764 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.525823 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.525924 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.525958 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:55Z","lastTransitionTime":"2025-10-05T20:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.629133 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.629235 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.629254 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.629278 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.629321 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:55Z","lastTransitionTime":"2025-10-05T20:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.732407 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.732476 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.732492 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.732517 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.732531 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:55Z","lastTransitionTime":"2025-10-05T20:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.836513 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.836569 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.836581 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.836603 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.836617 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:55Z","lastTransitionTime":"2025-10-05T20:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.851080 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.851166 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.851206 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:55 crc kubenswrapper[4753]: E1005 20:15:55.851258 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.851167 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:55 crc kubenswrapper[4753]: E1005 20:15:55.851389 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:55 crc kubenswrapper[4753]: E1005 20:15:55.851575 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:55 crc kubenswrapper[4753]: E1005 20:15:55.851647 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.939811 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.939892 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.939905 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.939950 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:55 crc kubenswrapper[4753]: I1005 20:15:55.939962 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:55Z","lastTransitionTime":"2025-10-05T20:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.042238 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.042296 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.042315 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.042341 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.042359 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:56Z","lastTransitionTime":"2025-10-05T20:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.144712 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.144996 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.145062 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.145126 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.145225 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:56Z","lastTransitionTime":"2025-10-05T20:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.247914 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.247972 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.247989 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.248017 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.248037 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:56Z","lastTransitionTime":"2025-10-05T20:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.350584 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.350644 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.350656 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.350681 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.350696 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:56Z","lastTransitionTime":"2025-10-05T20:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.453891 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.453993 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.454073 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.454157 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.454216 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:56Z","lastTransitionTime":"2025-10-05T20:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.556633 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.556822 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.556884 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.556941 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.556993 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:56Z","lastTransitionTime":"2025-10-05T20:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.659567 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.659622 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.659640 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.659667 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.659688 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:56Z","lastTransitionTime":"2025-10-05T20:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.762963 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.763012 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.763032 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.763057 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.763074 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:56Z","lastTransitionTime":"2025-10-05T20:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.865774 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.865814 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.865825 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.865839 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.865850 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:56Z","lastTransitionTime":"2025-10-05T20:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.968126 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.968195 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.968211 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.968231 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:56 crc kubenswrapper[4753]: I1005 20:15:56.968247 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:56Z","lastTransitionTime":"2025-10-05T20:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.073597 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.073652 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.073670 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.073691 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.073705 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:57Z","lastTransitionTime":"2025-10-05T20:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.176538 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.176600 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.176620 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.176651 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.176676 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:57Z","lastTransitionTime":"2025-10-05T20:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.279601 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.279675 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.279691 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.279710 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.279723 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:57Z","lastTransitionTime":"2025-10-05T20:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.382673 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.382730 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.382745 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.382770 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.382784 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:57Z","lastTransitionTime":"2025-10-05T20:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.485039 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.485095 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.485107 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.485131 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.485170 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:57Z","lastTransitionTime":"2025-10-05T20:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.587274 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.587316 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.587330 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.587346 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.587358 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:57Z","lastTransitionTime":"2025-10-05T20:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.689754 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.689988 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.690077 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.690165 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.690228 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:57Z","lastTransitionTime":"2025-10-05T20:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.792683 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.792734 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.792746 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.792767 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.792779 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:57Z","lastTransitionTime":"2025-10-05T20:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.851659 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.851681 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.851799 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:57 crc kubenswrapper[4753]: E1005 20:15:57.851945 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.852023 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:57 crc kubenswrapper[4753]: E1005 20:15:57.852243 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:57 crc kubenswrapper[4753]: E1005 20:15:57.852321 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:57 crc kubenswrapper[4753]: E1005 20:15:57.852387 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.895866 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.895906 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.895915 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.895928 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.895938 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:57Z","lastTransitionTime":"2025-10-05T20:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.998157 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.998207 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.998221 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.998244 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:57 crc kubenswrapper[4753]: I1005 20:15:57.998257 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:57Z","lastTransitionTime":"2025-10-05T20:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.100503 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.100559 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.100576 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.100602 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.100623 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:58Z","lastTransitionTime":"2025-10-05T20:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.203434 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.203509 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.203554 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.203588 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.203611 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:58Z","lastTransitionTime":"2025-10-05T20:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.306620 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.306674 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.306694 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.306754 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.306781 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:58Z","lastTransitionTime":"2025-10-05T20:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.409784 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.409836 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.409853 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.409878 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.409895 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:58Z","lastTransitionTime":"2025-10-05T20:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.511725 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.511765 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.511774 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.511791 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.511805 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:58Z","lastTransitionTime":"2025-10-05T20:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.614499 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.614540 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.614557 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.614579 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.614595 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:58Z","lastTransitionTime":"2025-10-05T20:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.716955 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.716998 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.717017 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.717040 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.717056 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:58Z","lastTransitionTime":"2025-10-05T20:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.819651 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.819699 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.819707 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.819724 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.819734 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:58Z","lastTransitionTime":"2025-10-05T20:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.923074 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.923115 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.923127 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.923159 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:58 crc kubenswrapper[4753]: I1005 20:15:58.923169 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:58Z","lastTransitionTime":"2025-10-05T20:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.025803 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.025836 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.025846 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.025859 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.025870 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:59Z","lastTransitionTime":"2025-10-05T20:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.128092 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.128159 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.128174 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.128192 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.128217 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:59Z","lastTransitionTime":"2025-10-05T20:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.164620 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.164712 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.164731 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.164757 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.164810 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:59Z","lastTransitionTime":"2025-10-05T20:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:59 crc kubenswrapper[4753]: E1005 20:15:59.181325 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:59Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.185556 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.185603 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.185629 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.185648 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.185659 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:59Z","lastTransitionTime":"2025-10-05T20:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:59 crc kubenswrapper[4753]: E1005 20:15:59.198739 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:59Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.203008 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.203049 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.203068 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.203088 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.203101 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:59Z","lastTransitionTime":"2025-10-05T20:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:59 crc kubenswrapper[4753]: E1005 20:15:59.217364 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:59Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.224348 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.224396 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.224408 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.224426 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.224438 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:59Z","lastTransitionTime":"2025-10-05T20:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:59 crc kubenswrapper[4753]: E1005 20:15:59.238901 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:59Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.243337 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.243378 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.243391 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.243416 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.243431 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:59Z","lastTransitionTime":"2025-10-05T20:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:59 crc kubenswrapper[4753]: E1005 20:15:59.257001 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:15:59Z is after 2025-08-24T17:21:41Z" Oct 05 20:15:59 crc kubenswrapper[4753]: E1005 20:15:59.257116 4753 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.258934 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.258970 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.258999 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.259019 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.259032 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:59Z","lastTransitionTime":"2025-10-05T20:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.362338 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.362398 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.362413 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.362437 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.362452 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:59Z","lastTransitionTime":"2025-10-05T20:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.465273 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.465381 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.465398 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.465421 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.465445 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:59Z","lastTransitionTime":"2025-10-05T20:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.568106 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.568194 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.568209 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.568261 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.568279 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:59Z","lastTransitionTime":"2025-10-05T20:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.671074 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.671115 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.671128 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.671171 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.671189 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:59Z","lastTransitionTime":"2025-10-05T20:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.774131 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.774194 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.774205 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.774227 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.774240 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:59Z","lastTransitionTime":"2025-10-05T20:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.854340 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:15:59 crc kubenswrapper[4753]: E1005 20:15:59.854482 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.854918 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:15:59 crc kubenswrapper[4753]: E1005 20:15:59.854972 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.855018 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:15:59 crc kubenswrapper[4753]: E1005 20:15:59.855063 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.855099 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:15:59 crc kubenswrapper[4753]: E1005 20:15:59.855154 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.877275 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.877323 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.877332 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.877351 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.877360 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:59Z","lastTransitionTime":"2025-10-05T20:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.981400 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.981441 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.981454 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.981472 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:15:59 crc kubenswrapper[4753]: I1005 20:15:59.981481 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:15:59Z","lastTransitionTime":"2025-10-05T20:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.056486 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs\") pod \"network-metrics-daemon-ktspr\" (UID: \"f99b8ef3-70ed-42e4-9217-a300fcd562d9\") " pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:00 crc kubenswrapper[4753]: E1005 20:16:00.056692 4753 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:16:00 crc kubenswrapper[4753]: E1005 20:16:00.056787 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs podName:f99b8ef3-70ed-42e4-9217-a300fcd562d9 nodeName:}" failed. No retries permitted until 2025-10-05 20:16:32.056765756 +0000 UTC m=+100.905093978 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs") pod "network-metrics-daemon-ktspr" (UID: "f99b8ef3-70ed-42e4-9217-a300fcd562d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.084856 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.084903 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.084913 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.084931 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.084942 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:00Z","lastTransitionTime":"2025-10-05T20:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.188480 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.188531 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.188541 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.188558 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.188573 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:00Z","lastTransitionTime":"2025-10-05T20:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.290919 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.290987 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.291003 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.291024 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.291035 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:00Z","lastTransitionTime":"2025-10-05T20:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.393860 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.393905 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.393938 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.393957 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.393969 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:00Z","lastTransitionTime":"2025-10-05T20:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.497466 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.497529 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.497548 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.497596 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.497631 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:00Z","lastTransitionTime":"2025-10-05T20:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.601050 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.601120 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.601229 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.601260 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.601278 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:00Z","lastTransitionTime":"2025-10-05T20:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.703995 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.704059 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.704075 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.704099 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.704116 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:00Z","lastTransitionTime":"2025-10-05T20:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.806568 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.806629 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.806642 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.806657 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.806667 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:00Z","lastTransitionTime":"2025-10-05T20:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.851851 4753 scope.go:117] "RemoveContainer" containerID="76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f" Oct 05 20:16:00 crc kubenswrapper[4753]: E1005 20:16:00.852077 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.909290 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.909351 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.909370 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.909395 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:00 crc kubenswrapper[4753]: I1005 20:16:00.909412 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:00Z","lastTransitionTime":"2025-10-05T20:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.012326 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.012429 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.012461 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.012500 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.012525 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:01Z","lastTransitionTime":"2025-10-05T20:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.114863 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.114914 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.114926 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.114947 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.114958 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:01Z","lastTransitionTime":"2025-10-05T20:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.217651 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.217699 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.217730 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.217749 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.217762 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:01Z","lastTransitionTime":"2025-10-05T20:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.321007 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.321082 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.321101 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.321128 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.321170 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:01Z","lastTransitionTime":"2025-10-05T20:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.424109 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.424209 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.424224 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.424246 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.424273 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:01Z","lastTransitionTime":"2025-10-05T20:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.527283 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.527323 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.527333 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.527348 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.527357 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:01Z","lastTransitionTime":"2025-10-05T20:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.628860 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.628919 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.628932 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.628948 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.628957 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:01Z","lastTransitionTime":"2025-10-05T20:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.731627 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.731668 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.731677 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.731694 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.731704 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:01Z","lastTransitionTime":"2025-10-05T20:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.834425 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.834468 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.834477 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.834492 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.834506 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:01Z","lastTransitionTime":"2025-10-05T20:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.851771 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.851791 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.851791 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:01 crc kubenswrapper[4753]: E1005 20:16:01.851902 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:01 crc kubenswrapper[4753]: E1005 20:16:01.851998 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.852020 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:01 crc kubenswrapper[4753]: E1005 20:16:01.852124 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:01 crc kubenswrapper[4753]: E1005 20:16:01.852237 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.880090 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06
bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 
only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:01Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.892524 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:01Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.906263 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:01Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.922984 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:43Z\\\",\\\"message\\\":\\\"t:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825939 6325 transact.go:42] Configuring OVN: 
[{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825356 6325 port_cache.go:96] port-cache(openshift-network-diagnostics_network-check-target-xd92c): added port \\\\u0026{name:openshift-network-diagnostics_network-check-target-xd92c uuid:61897e97-c771-4738-8709-09636387cb00 logicalSwitch:crc ips:[0xc0094d9bc0] mac:[10 88 10 217 0 4] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.4/23] and MAC: 0a:58:0a:d9:00:04\\\\nI1005 20:15:43.826133 6325 pods.go:252] [openshift-network-diagnostics/network-check-target-xd\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:01Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.934510 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7b2637f-73a4-4dbb-b1f0-2ed149e34157\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf427d5366f0edf753e57bb958b0e30b93716e0d46779a5c6fcca2db6794c868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:01Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.938294 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.938329 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.938339 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.938354 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.938363 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:01Z","lastTransitionTime":"2025-10-05T20:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.946288 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:01Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.961506 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:01Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.972133 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:01Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:01 crc kubenswrapper[4753]: I1005 20:16:01.983260 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:01Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:01 crc 
kubenswrapper[4753]: I1005 20:16:01.999681 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:01Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.013740 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.022798 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.030486 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.040370 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.040410 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.040480 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.040497 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.040506 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:02Z","lastTransitionTime":"2025-10-05T20:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.045652 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.059511 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.070172 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.082661 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\
",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.142603 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.142909 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.142919 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.142932 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.142940 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:02Z","lastTransitionTime":"2025-10-05T20:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.244930 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.244966 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.244977 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.244995 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.245006 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:02Z","lastTransitionTime":"2025-10-05T20:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.284780 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr5q8_8a6cead6-0872-4b49-a08c-529805f646f2/kube-multus/0.log" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.284849 4753 generic.go:334] "Generic (PLEG): container finished" podID="8a6cead6-0872-4b49-a08c-529805f646f2" containerID="a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039" exitCode=1 Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.284892 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr5q8" event={"ID":"8a6cead6-0872-4b49-a08c-529805f646f2","Type":"ContainerDied","Data":"a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039"} Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.285466 4753 scope.go:117] "RemoveContainer" containerID="a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.303799 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569
badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.327772 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.341449 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.347389 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.347414 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.347423 4753 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.347436 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.347446 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:02Z","lastTransitionTime":"2025-10-05T20:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.362669 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:43Z\\\",\\\"message\\\":\\\"t:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825939 6325 transact.go:42] 
Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825356 6325 port_cache.go:96] port-cache(openshift-network-diagnostics_network-check-target-xd92c): added port \\\\u0026{name:openshift-network-diagnostics_network-check-target-xd92c uuid:61897e97-c771-4738-8709-09636387cb00 logicalSwitch:crc ips:[0xc0094d9bc0] mac:[10 88 10 217 0 4] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.4/23] and MAC: 0a:58:0a:d9:00:04\\\\nI1005 20:15:43.826133 6325 pods.go:252] [openshift-network-diagnostics/network-check-target-xd\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.381633 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7b2637f-73a4-4dbb-b1f0-2ed149e34157\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf427d5366f0edf753e57bb958b0e30b93716e0d46779a5c6fcca2db6794c868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.393567 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.405467 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.414759 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.426059 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.448000 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.450219 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.450319 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.450519 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.450686 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.450753 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:02Z","lastTransitionTime":"2025-10-05T20:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.458117 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b6268b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.467481 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.478534 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.489573 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.501985 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:16:02Z\\\",\\\"message\\\":\\\"2025-10-05T20:15:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b\\\\n2025-10-05T20:15:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b to /host/opt/cni/bin/\\\\n2025-10-05T20:15:17Z [verbose] multus-daemon started\\\\n2025-10-05T20:15:17Z [verbose] Readiness Indicator file check\\\\n2025-10-05T20:16:02Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.517578 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.528370 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d
8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:02Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.553433 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.553485 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.553499 4753 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.553520 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.553532 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:02Z","lastTransitionTime":"2025-10-05T20:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.656075 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.656127 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.656155 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.656175 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.656187 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:02Z","lastTransitionTime":"2025-10-05T20:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.759617 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.759847 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.759927 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.759998 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.760066 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:02Z","lastTransitionTime":"2025-10-05T20:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.862601 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.862861 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.862934 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.863020 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.863096 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:02Z","lastTransitionTime":"2025-10-05T20:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.965352 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.965670 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.965740 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.965812 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:02 crc kubenswrapper[4753]: I1005 20:16:02.965876 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:02Z","lastTransitionTime":"2025-10-05T20:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.067768 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.067801 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.067810 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.067823 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.067833 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:03Z","lastTransitionTime":"2025-10-05T20:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.170205 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.170239 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.170248 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.170260 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.170268 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:03Z","lastTransitionTime":"2025-10-05T20:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.272446 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.272492 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.272503 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.272520 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.272535 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:03Z","lastTransitionTime":"2025-10-05T20:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.289167 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr5q8_8a6cead6-0872-4b49-a08c-529805f646f2/kube-multus/0.log" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.289223 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr5q8" event={"ID":"8a6cead6-0872-4b49-a08c-529805f646f2","Type":"ContainerStarted","Data":"d7ac7d15272eb1688a6a443c556a4ef57930c430084243bdff7df0d017044402"} Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.299952 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7b2637f-73a4-4dbb-b1f0-2ed149e34157\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf427d5366f0edf753e57bb958b0e30b93716e0d46779a5c6fcca2db6794c868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.311414 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.322046 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.330463 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.341066 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.352381 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.360912 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.371362 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.374902 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.374955 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.374970 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.374987 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.374998 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:03Z","lastTransitionTime":"2025-10-05T20:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.381399 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc 
kubenswrapper[4753]: I1005 20:16:03.391056 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.401894 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.413729 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7ac7d15272eb1688a6a443c556a4ef57930c430084243bdff7df0d017044402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:16:02Z\\\",\\\"message\\\":\\\"2025-10-05T20:15:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b\\\\n2025-10-05T20:15:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b to /host/opt/cni/bin/\\\\n2025-10-05T20:15:17Z [verbose] multus-daemon started\\\\n2025-10-05T20:15:17Z [verbose] Readiness Indicator file check\\\\n2025-10-05T20:16:02Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:16:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.426988 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04
fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.439024 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.449347 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.459275 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.474068 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:43Z\\\",\\\"message\\\":\\\"t:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825939 6325 transact.go:42] Configuring OVN: 
[{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825356 6325 port_cache.go:96] port-cache(openshift-network-diagnostics_network-check-target-xd92c): added port \\\\u0026{name:openshift-network-diagnostics_network-check-target-xd92c uuid:61897e97-c771-4738-8709-09636387cb00 logicalSwitch:crc ips:[0xc0094d9bc0] mac:[10 88 10 217 0 4] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.4/23] and MAC: 0a:58:0a:d9:00:04\\\\nI1005 20:15:43.826133 6325 pods.go:252] [openshift-network-diagnostics/network-check-target-xd\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:03Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.477266 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.477299 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.477309 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.477323 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.477333 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:03Z","lastTransitionTime":"2025-10-05T20:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.579284 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.579324 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.579336 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.579352 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.579363 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:03Z","lastTransitionTime":"2025-10-05T20:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.681081 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.681169 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.681181 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.681200 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.681214 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:03Z","lastTransitionTime":"2025-10-05T20:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.783038 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.783080 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.783090 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.783104 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.783112 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:03Z","lastTransitionTime":"2025-10-05T20:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.852302 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.852345 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.852366 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:03 crc kubenswrapper[4753]: E1005 20:16:03.852428 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.852460 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:03 crc kubenswrapper[4753]: E1005 20:16:03.852504 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:03 crc kubenswrapper[4753]: E1005 20:16:03.852545 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:03 crc kubenswrapper[4753]: E1005 20:16:03.852740 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.867029 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.884906 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.884940 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.884949 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.884963 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.884973 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:03Z","lastTransitionTime":"2025-10-05T20:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.986865 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.986898 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.986905 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.986935 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:03 crc kubenswrapper[4753]: I1005 20:16:03.986944 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:03Z","lastTransitionTime":"2025-10-05T20:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.089023 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.089066 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.089078 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.089095 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.089106 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:04Z","lastTransitionTime":"2025-10-05T20:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.191392 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.191432 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.191441 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.191454 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.191463 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:04Z","lastTransitionTime":"2025-10-05T20:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.293042 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.293084 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.293096 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.293114 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.293128 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:04Z","lastTransitionTime":"2025-10-05T20:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.395824 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.395865 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.395873 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.395888 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.395896 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:04Z","lastTransitionTime":"2025-10-05T20:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.499310 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.499344 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.499353 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.499367 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.499376 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:04Z","lastTransitionTime":"2025-10-05T20:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.601868 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.601897 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.601905 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.601918 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.601927 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:04Z","lastTransitionTime":"2025-10-05T20:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.704604 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.704655 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.704668 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.704687 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.704700 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:04Z","lastTransitionTime":"2025-10-05T20:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.807039 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.807086 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.807098 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.807113 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.807125 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:04Z","lastTransitionTime":"2025-10-05T20:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.909104 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.909161 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.909173 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.909187 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:04 crc kubenswrapper[4753]: I1005 20:16:04.909199 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:04Z","lastTransitionTime":"2025-10-05T20:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.011420 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.011463 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.011475 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.011493 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.011504 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:05Z","lastTransitionTime":"2025-10-05T20:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.113247 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.113292 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.113305 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.113322 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.113350 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:05Z","lastTransitionTime":"2025-10-05T20:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.214909 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.214955 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.214966 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.214979 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.214987 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:05Z","lastTransitionTime":"2025-10-05T20:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.316571 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.316622 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.316638 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.316653 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.316663 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:05Z","lastTransitionTime":"2025-10-05T20:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.418734 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.418780 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.418789 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.418805 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.418813 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:05Z","lastTransitionTime":"2025-10-05T20:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.520635 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.520674 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.520682 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.520696 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.520705 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:05Z","lastTransitionTime":"2025-10-05T20:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.623059 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.623099 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.623109 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.623123 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.623132 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:05Z","lastTransitionTime":"2025-10-05T20:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.725449 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.725489 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.725501 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.725517 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.725526 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:05Z","lastTransitionTime":"2025-10-05T20:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.828088 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.828126 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.828160 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.828176 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.828186 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:05Z","lastTransitionTime":"2025-10-05T20:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.851424 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.851429 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:05 crc kubenswrapper[4753]: E1005 20:16:05.851589 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.851455 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.851456 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:05 crc kubenswrapper[4753]: E1005 20:16:05.851673 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:05 crc kubenswrapper[4753]: E1005 20:16:05.851763 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:05 crc kubenswrapper[4753]: E1005 20:16:05.851827 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.930178 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.930489 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.930604 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.930741 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:05 crc kubenswrapper[4753]: I1005 20:16:05.930826 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:05Z","lastTransitionTime":"2025-10-05T20:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.033035 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.033353 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.033526 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.033643 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.033748 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:06Z","lastTransitionTime":"2025-10-05T20:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.135619 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.135648 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.135656 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.135669 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.135677 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:06Z","lastTransitionTime":"2025-10-05T20:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.238053 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.238304 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.238376 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.238465 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.238551 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:06Z","lastTransitionTime":"2025-10-05T20:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.340836 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.341337 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.341401 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.341482 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.341563 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:06Z","lastTransitionTime":"2025-10-05T20:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.444709 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.444741 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.444748 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.444763 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.444771 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:06Z","lastTransitionTime":"2025-10-05T20:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.547503 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.547543 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.547556 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.547572 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.547584 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:06Z","lastTransitionTime":"2025-10-05T20:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.649913 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.650360 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.650568 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.650826 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.651072 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:06Z","lastTransitionTime":"2025-10-05T20:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.753808 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.753867 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.753884 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.753911 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.753928 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:06Z","lastTransitionTime":"2025-10-05T20:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.856995 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.857043 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.857052 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.857064 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.857073 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:06Z","lastTransitionTime":"2025-10-05T20:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.958821 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.958866 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.958876 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.958891 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:06 crc kubenswrapper[4753]: I1005 20:16:06.958903 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:06Z","lastTransitionTime":"2025-10-05T20:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.061569 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.061608 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.061619 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.061639 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.061651 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:07Z","lastTransitionTime":"2025-10-05T20:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.164723 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.164925 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.164952 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.164978 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.164996 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:07Z","lastTransitionTime":"2025-10-05T20:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.267954 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.268038 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.268058 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.268087 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.268108 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:07Z","lastTransitionTime":"2025-10-05T20:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.370687 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.370794 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.370813 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.370829 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.370840 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:07Z","lastTransitionTime":"2025-10-05T20:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.473207 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.473243 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.473252 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.473268 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.473280 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:07Z","lastTransitionTime":"2025-10-05T20:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.575736 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.575796 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.575814 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.575841 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.575860 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:07Z","lastTransitionTime":"2025-10-05T20:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.679281 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.679343 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.679359 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.679382 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.679398 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:07Z","lastTransitionTime":"2025-10-05T20:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.783834 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.783894 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.783911 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.783937 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.783955 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:07Z","lastTransitionTime":"2025-10-05T20:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.851590 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.851639 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.851738 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.851747 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:07 crc kubenswrapper[4753]: E1005 20:16:07.851882 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:07 crc kubenswrapper[4753]: E1005 20:16:07.852102 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:07 crc kubenswrapper[4753]: E1005 20:16:07.852223 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:07 crc kubenswrapper[4753]: E1005 20:16:07.852299 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.886773 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.887239 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.887431 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.887669 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.887817 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:07Z","lastTransitionTime":"2025-10-05T20:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.991792 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.991867 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.991887 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.991918 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:07 crc kubenswrapper[4753]: I1005 20:16:07.991940 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:07Z","lastTransitionTime":"2025-10-05T20:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.095073 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.095175 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.095194 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.095217 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.095232 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:08Z","lastTransitionTime":"2025-10-05T20:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.198256 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.198338 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.198362 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.198392 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.198415 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:08Z","lastTransitionTime":"2025-10-05T20:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.302114 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.302166 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.302174 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.302188 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.302196 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:08Z","lastTransitionTime":"2025-10-05T20:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.406106 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.406181 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.406193 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.406214 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.406228 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:08Z","lastTransitionTime":"2025-10-05T20:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.510499 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.510564 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.510585 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.510614 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.510635 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:08Z","lastTransitionTime":"2025-10-05T20:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.613737 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.613812 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.613835 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.613870 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.613893 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:08Z","lastTransitionTime":"2025-10-05T20:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.718346 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.718423 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.718447 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.718475 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.718495 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:08Z","lastTransitionTime":"2025-10-05T20:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.821529 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.821599 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.821619 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.821648 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.821667 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:08Z","lastTransitionTime":"2025-10-05T20:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.924709 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.924778 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.924798 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.924828 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:08 crc kubenswrapper[4753]: I1005 20:16:08.924848 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:08Z","lastTransitionTime":"2025-10-05T20:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.027413 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.027489 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.027507 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.027538 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.027559 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:09Z","lastTransitionTime":"2025-10-05T20:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.132550 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.132631 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.132650 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.132680 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.132701 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:09Z","lastTransitionTime":"2025-10-05T20:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.236381 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.236447 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.236463 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.236486 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.236502 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:09Z","lastTransitionTime":"2025-10-05T20:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.260453 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.260486 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.260495 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.260509 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.260518 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:09Z","lastTransitionTime":"2025-10-05T20:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:09 crc kubenswrapper[4753]: E1005 20:16:09.273885 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:09Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.277287 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.277319 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.277330 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.277345 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.277356 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:09Z","lastTransitionTime":"2025-10-05T20:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:09 crc kubenswrapper[4753]: E1005 20:16:09.289873 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:09Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.294210 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.294331 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.294347 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.294364 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.294377 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:09Z","lastTransitionTime":"2025-10-05T20:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:09 crc kubenswrapper[4753]: E1005 20:16:09.307885 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:09Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.311534 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.311590 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.311602 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.311617 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.311628 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:09Z","lastTransitionTime":"2025-10-05T20:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:09 crc kubenswrapper[4753]: E1005 20:16:09.324576 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:09Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.327682 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.327722 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.327733 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.327748 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.327762 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:09Z","lastTransitionTime":"2025-10-05T20:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:09 crc kubenswrapper[4753]: E1005 20:16:09.339354 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:09Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:09 crc kubenswrapper[4753]: E1005 20:16:09.339457 4753 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.340709 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.340747 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.340755 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.340771 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.340782 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:09Z","lastTransitionTime":"2025-10-05T20:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.442844 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.442887 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.442897 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.442910 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.442919 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:09Z","lastTransitionTime":"2025-10-05T20:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.547063 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.547178 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.547202 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.547229 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.547251 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:09Z","lastTransitionTime":"2025-10-05T20:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.649611 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.649646 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.649677 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.649691 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.649700 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:09Z","lastTransitionTime":"2025-10-05T20:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.751662 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.751700 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.751711 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.751726 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.751737 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:09Z","lastTransitionTime":"2025-10-05T20:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.851931 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.851940 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.852034 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:09 crc kubenswrapper[4753]: E1005 20:16:09.852066 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.852398 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:09 crc kubenswrapper[4753]: E1005 20:16:09.852404 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:09 crc kubenswrapper[4753]: E1005 20:16:09.852396 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:09 crc kubenswrapper[4753]: E1005 20:16:09.852616 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.856103 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.856179 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.856197 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.856219 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.856240 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:09Z","lastTransitionTime":"2025-10-05T20:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.958198 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.958226 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.958235 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.958252 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:09 crc kubenswrapper[4753]: I1005 20:16:09.958263 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:09Z","lastTransitionTime":"2025-10-05T20:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.060943 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.061005 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.061029 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.061057 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.061077 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:10Z","lastTransitionTime":"2025-10-05T20:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.163114 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.163195 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.163205 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.163221 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.163230 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:10Z","lastTransitionTime":"2025-10-05T20:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.265743 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.265795 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.265809 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.265828 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.265843 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:10Z","lastTransitionTime":"2025-10-05T20:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.368655 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.368713 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.368725 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.368743 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.368755 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:10Z","lastTransitionTime":"2025-10-05T20:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.471442 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.471498 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.471516 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.471540 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.471556 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:10Z","lastTransitionTime":"2025-10-05T20:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.573643 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.573717 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.573740 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.573769 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.573797 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:10Z","lastTransitionTime":"2025-10-05T20:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.676677 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.676737 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.676749 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.676767 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.676781 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:10Z","lastTransitionTime":"2025-10-05T20:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.779325 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.779369 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.779381 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.779401 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.779413 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:10Z","lastTransitionTime":"2025-10-05T20:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.881191 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.881222 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.881231 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.881243 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.881255 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:10Z","lastTransitionTime":"2025-10-05T20:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.983643 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.983683 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.983697 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.983712 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:10 crc kubenswrapper[4753]: I1005 20:16:10.983721 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:10Z","lastTransitionTime":"2025-10-05T20:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.086092 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.086126 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.086166 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.086184 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.086193 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:11Z","lastTransitionTime":"2025-10-05T20:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.188998 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.189054 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.189070 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.189093 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.189109 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:11Z","lastTransitionTime":"2025-10-05T20:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.292427 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.292512 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.292532 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.292860 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.293192 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:11Z","lastTransitionTime":"2025-10-05T20:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.396469 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.396519 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.396537 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.396559 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.396576 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:11Z","lastTransitionTime":"2025-10-05T20:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.499115 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.499206 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.499225 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.499246 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.499262 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:11Z","lastTransitionTime":"2025-10-05T20:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.602769 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.602836 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.602858 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.602885 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.602905 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:11Z","lastTransitionTime":"2025-10-05T20:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.705656 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.705732 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.705755 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.705780 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.705797 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:11Z","lastTransitionTime":"2025-10-05T20:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.808567 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.808633 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.808655 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.808685 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.808708 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:11Z","lastTransitionTime":"2025-10-05T20:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.851728 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.851740 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.851790 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.851955 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:11 crc kubenswrapper[4753]: E1005 20:16:11.852097 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:11 crc kubenswrapper[4753]: E1005 20:16:11.852654 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:11 crc kubenswrapper[4753]: E1005 20:16:11.852765 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:11 crc kubenswrapper[4753]: E1005 20:16:11.852853 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.872877 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6e
f8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:11Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.892871 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:11Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.908900 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:11Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.911253 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.911313 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.911333 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.911387 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.911408 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:11Z","lastTransitionTime":"2025-10-05T20:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.924223 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:11Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:11 crc 
kubenswrapper[4753]: I1005 20:16:11.942659 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:11Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.955061 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b76c0217-b467-4806-8933-4afa084c51e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359ee28cd996a4068246945cb7d6b0cb2921ea96607659b72f61679d8e819242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:
14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:11Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.970476 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:16:11Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:11 crc kubenswrapper[4753]: I1005 20:16:11.991154 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7ac7d15272eb1688a6a443c556a4ef57930c430084243bdff7df0d017044402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:16:02Z\\\",\\\"message\\\":\\\"2025-10-05T20:15:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b\\\\n2025-10-05T20:15:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b to /host/opt/cni/bin/\\\\n2025-10-05T20:15:17Z [verbose] multus-daemon started\\\\n2025-10-05T20:15:17Z [verbose] Readiness Indicator file check\\\\n2025-10-05T20:16:02Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:16:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:11Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.014020 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.014083 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.014091 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.014105 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.014114 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:12Z","lastTransitionTime":"2025-10-05T20:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.014874 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:12Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.031279 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d
8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:12Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.049345 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:12Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.066844 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:12Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.100235 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:43Z\\\",\\\"message\\\":\\\"t:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825939 6325 transact.go:42] Configuring OVN: 
[{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825356 6325 port_cache.go:96] port-cache(openshift-network-diagnostics_network-check-target-xd92c): added port \\\\u0026{name:openshift-network-diagnostics_network-check-target-xd92c uuid:61897e97-c771-4738-8709-09636387cb00 logicalSwitch:crc ips:[0xc0094d9bc0] mac:[10 88 10 217 0 4] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.4/23] and MAC: 0a:58:0a:d9:00:04\\\\nI1005 20:15:43.826133 6325 pods.go:252] [openshift-network-diagnostics/network-check-target-xd\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:12Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.117009 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.117043 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.117053 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.117078 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.117088 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:12Z","lastTransitionTime":"2025-10-05T20:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.118206 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e
c25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:12Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.138682 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:12Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.158431 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:12Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.171110 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:12Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.184978 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7b2637f-73a4-4dbb-b1f0-2ed149e34157\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf427d5366f0edf753e57bb958b0e30b93716e0d46779a5c6fcca2db6794c868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:12Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.219090 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.219173 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.219191 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 
05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.219218 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.219237 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:12Z","lastTransitionTime":"2025-10-05T20:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.321990 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.322039 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.322048 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.322063 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.322072 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:12Z","lastTransitionTime":"2025-10-05T20:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.424860 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.424927 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.424946 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.424972 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.424988 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:12Z","lastTransitionTime":"2025-10-05T20:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.528018 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.528061 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.528069 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.528082 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.528092 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:12Z","lastTransitionTime":"2025-10-05T20:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.630402 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.630439 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.630448 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.630462 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.630472 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:12Z","lastTransitionTime":"2025-10-05T20:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.733027 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.733094 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.733111 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.733164 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.733188 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:12Z","lastTransitionTime":"2025-10-05T20:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.835625 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.835678 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.835690 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.835710 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.835755 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:12Z","lastTransitionTime":"2025-10-05T20:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.938984 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.939048 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.939065 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.939088 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:12 crc kubenswrapper[4753]: I1005 20:16:12.939110 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:12Z","lastTransitionTime":"2025-10-05T20:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.042360 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.042423 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.042444 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.042468 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.042485 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:13Z","lastTransitionTime":"2025-10-05T20:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.144610 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.144686 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.144707 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.144734 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.144754 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:13Z","lastTransitionTime":"2025-10-05T20:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.247024 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.247054 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.247062 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.247077 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.247085 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:13Z","lastTransitionTime":"2025-10-05T20:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.349445 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.349569 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.349589 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.349617 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.349641 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:13Z","lastTransitionTime":"2025-10-05T20:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.453411 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.453480 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.453502 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.453534 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.453555 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:13Z","lastTransitionTime":"2025-10-05T20:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.557025 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.557089 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.557107 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.557131 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.557196 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:13Z","lastTransitionTime":"2025-10-05T20:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.660659 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.660727 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.660750 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.660779 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.660799 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:13Z","lastTransitionTime":"2025-10-05T20:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.764531 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.764591 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.764659 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.764684 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.764702 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:13Z","lastTransitionTime":"2025-10-05T20:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.851982 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.852060 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.852177 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:13 crc kubenswrapper[4753]: E1005 20:16:13.852328 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.852365 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:13 crc kubenswrapper[4753]: E1005 20:16:13.853463 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:13 crc kubenswrapper[4753]: E1005 20:16:13.853539 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:13 crc kubenswrapper[4753]: E1005 20:16:13.853680 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.853889 4753 scope.go:117] "RemoveContainer" containerID="76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.867893 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.867953 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.867979 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.868009 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.868035 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:13Z","lastTransitionTime":"2025-10-05T20:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.972691 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.973825 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.973884 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.973921 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:13 crc kubenswrapper[4753]: I1005 20:16:13.973945 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:13Z","lastTransitionTime":"2025-10-05T20:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.077846 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.077949 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.078019 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.078100 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.078166 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:14Z","lastTransitionTime":"2025-10-05T20:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.182720 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.182768 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.182780 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.182797 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.182809 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:14Z","lastTransitionTime":"2025-10-05T20:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.285328 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.285378 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.285392 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.285412 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.285424 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:14Z","lastTransitionTime":"2025-10-05T20:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.327317 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/2.log" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.330569 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerStarted","Data":"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16"} Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.331069 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.357757 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 
20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.371004 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.381392 4753 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.387512 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.387551 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.387562 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.387580 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.387591 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:14Z","lastTransitionTime":"2025-10-05T20:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.400219 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:43Z\\\",\\\"message\\\":\\\"t:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825939 6325 transact.go:42] Configuring OVN: 
[{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825356 6325 port_cache.go:96] port-cache(openshift-network-diagnostics_network-check-target-xd92c): added port \\\\u0026{name:openshift-network-diagnostics_network-check-target-xd92c uuid:61897e97-c771-4738-8709-09636387cb00 logicalSwitch:crc ips:[0xc0094d9bc0] mac:[10 88 10 217 0 4] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.4/23] and MAC: 0a:58:0a:d9:00:04\\\\nI1005 20:15:43.826133 6325 pods.go:252] 
[openshift-network-diagnostics/network-check-target-xd\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mo
untPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.412278 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7b2637f-73a4-4dbb-b1f0-2ed149e34157\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf427d5366f0edf753e57bb958b0e30b93716e0d46779a5c6fcca2db6794c868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.425029 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.435778 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.446111 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.455441 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc 
kubenswrapper[4753]: I1005 20:16:14.470917 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.484946 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.489941 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.489993 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:14 crc kubenswrapper[4753]: 
I1005 20:16:14.490007 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.490025 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.490037 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:14Z","lastTransitionTime":"2025-10-05T20:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.496915 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.508700 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.525340 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785
a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96a
fa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
5-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.540598 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d
8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.554983 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b76c0217-b467-4806-8933-4afa084c51e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359ee28cd996a4068246945cb7d6b0cb2921ea96607659b72f61679d8e819242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.570365 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.582094 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7ac7d15272eb1688a6a443c556a4ef57930c430084243bdff7df0d017044402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:16:02Z\\\",\\\"message\\\":\\\"2025-10-05T20:15:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b\\\\n2025-10-05T20:15:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b to /host/opt/cni/bin/\\\\n2025-10-05T20:15:17Z [verbose] multus-daemon started\\\\n2025-10-05T20:15:17Z [verbose] Readiness Indicator file check\\\\n2025-10-05T20:16:02Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:16:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.593354 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.593391 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.593409 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.593433 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.593450 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:14Z","lastTransitionTime":"2025-10-05T20:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.697453 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.697517 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.697529 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.697550 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.697564 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:14Z","lastTransitionTime":"2025-10-05T20:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.800739 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.800797 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.800814 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.801259 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.801314 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:14Z","lastTransitionTime":"2025-10-05T20:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.907502 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.907548 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.907578 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.907602 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:14 crc kubenswrapper[4753]: I1005 20:16:14.907617 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:14Z","lastTransitionTime":"2025-10-05T20:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.010979 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.011022 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.011033 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.011051 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.011083 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:15Z","lastTransitionTime":"2025-10-05T20:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.113771 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.113810 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.113819 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.113835 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.113845 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:15Z","lastTransitionTime":"2025-10-05T20:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.218288 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.218348 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.218372 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.218398 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.218418 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:15Z","lastTransitionTime":"2025-10-05T20:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.321607 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.321668 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.321681 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.321699 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.321713 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:15Z","lastTransitionTime":"2025-10-05T20:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.336944 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/3.log" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.338315 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/2.log" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.341938 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerID="84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16" exitCode=1 Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.342008 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerDied","Data":"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16"} Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.342073 4753 scope.go:117] "RemoveContainer" containerID="76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.344214 4753 scope.go:117] "RemoveContainer" containerID="84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16" Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.344851 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.370217 4753 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.387559 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.402269 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.413495 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.429796 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.429851 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.429863 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.429884 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.429897 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:15Z","lastTransitionTime":"2025-10-05T20:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.431864 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc 
kubenswrapper[4753]: I1005 20:16:15.451031 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b76c0217-b467-4806-8933-4afa084c51e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359ee28cd996a4068246945cb7d6b0cb2921ea96607659b72f61679d8e819242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.467910 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.491690 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7ac7d15272eb1688a6a443c556a4ef57930c430084243bdff7df0d017044402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:16:02Z\\\",\\\"message\\\":\\\"2025-10-05T20:15:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b\\\\n2025-10-05T20:15:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b to /host/opt/cni/bin/\\\\n2025-10-05T20:15:17Z [verbose] multus-daemon started\\\\n2025-10-05T20:15:17Z [verbose] Readiness Indicator file check\\\\n2025-10-05T20:16:02Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:16:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.508559 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04
fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.522608 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.534221 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.534263 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.534273 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.534294 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.534305 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:15Z","lastTransitionTime":"2025-10-05T20:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.539626 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.553506 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.566024 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.589930 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76035d414bdcd64887bac6c7155d45ee83d6f98d41b28f07576d9154b3cf1f5f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:15:43Z\\\",\\\"message\\\":\\\"t:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825939 6325 transact.go:42] Configuring OVN: 
[{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1005 20:15:43.825356 6325 port_cache.go:96] port-cache(openshift-network-diagnostics_network-check-target-xd92c): added port \\\\u0026{name:openshift-network-diagnostics_network-check-target-xd92c uuid:61897e97-c771-4738-8709-09636387cb00 logicalSwitch:crc ips:[0xc0094d9bc0] mac:[10 88 10 217 0 4] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.4/23] and MAC: 0a:58:0a:d9:00:04\\\\nI1005 20:15:43.826133 6325 pods.go:252] [openshift-network-diagnostics/network-check-target-xd\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:16:14Z\\\",\\\"message\\\":\\\"l_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:16:14.785697 6719 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 
2025-08-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:16:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8
244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.604207 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7b2637f-73a4-4dbb-b1f0-2ed149e34157\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf427d5366f0edf753e57bb958b0e30b93716e0d46779a5c6fcca2db6794c868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.619676 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.634887 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.635049 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.635478 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.63544588 +0000 UTC m=+148.483774152 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.635579 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.635722 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.635680427 +0000 UTC m=+148.484008699 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.637340 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.637412 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.637441 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.637481 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.637510 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:15Z","lastTransitionTime":"2025-10-05T20:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.638078 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.655495 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:15Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.736502 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.736582 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.736649 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.736797 4753 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 05 
20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.736798 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.736860 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.736890 4753 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.736921 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.736882684 +0000 UTC m=+148.585210956 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.736953 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.737020 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.737041 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.736963 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.736940837 +0000 UTC m=+148.585269109 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.737207 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.737180063 +0000 UTC m=+148.585508305 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.741004 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.741057 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.741085 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.741121 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.741183 4753 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:15Z","lastTransitionTime":"2025-10-05T20:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.844195 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.844277 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.844296 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.844322 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.844342 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:15Z","lastTransitionTime":"2025-10-05T20:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.851654 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.851726 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.851673 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.851856 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.851984 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.852238 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.852320 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:15 crc kubenswrapper[4753]: E1005 20:16:15.852439 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.947743 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.947800 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.947815 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.947885 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:15 crc kubenswrapper[4753]: I1005 20:16:15.947905 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:15Z","lastTransitionTime":"2025-10-05T20:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.050876 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.051101 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.051133 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.051207 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.051248 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:16Z","lastTransitionTime":"2025-10-05T20:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.154178 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.154261 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.154282 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.154317 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.154338 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:16Z","lastTransitionTime":"2025-10-05T20:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.258284 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.258363 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.258382 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.258414 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.258436 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:16Z","lastTransitionTime":"2025-10-05T20:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.348045 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/3.log" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.352859 4753 scope.go:117] "RemoveContainer" containerID="84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16" Oct 05 20:16:16 crc kubenswrapper[4753]: E1005 20:16:16.353034 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.360589 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.360630 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.360642 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.361335 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.361354 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:16Z","lastTransitionTime":"2025-10-05T20:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.364943 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc 
kubenswrapper[4753]: I1005 20:16:16.386716 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.404776 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.422112 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.434571 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.454311 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785
a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96a
fa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
5-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.464125 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.464383 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.464531 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.464684 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.464814 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:16Z","lastTransitionTime":"2025-10-05T20:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.470555 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.484277 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b76c0217-b467-4806-8933-4afa084c51e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359ee28cd996a4068246945cb7d6b0cb2921ea96607659b72f61679d8e819242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPa
th\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.502118 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.522038 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7ac7d15272eb1688a6a443c556a4ef57930c430084243bdff7df0d017044402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:16:02Z\\\",\\\"message\\\":\\\"2025-10-05T20:15:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b\\\\n2025-10-05T20:15:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b to /host/opt/cni/bin/\\\\n2025-10-05T20:15:17Z [verbose] multus-daemon started\\\\n2025-10-05T20:15:17Z [verbose] Readiness Indicator file check\\\\n2025-10-05T20:16:02Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:16:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.541810 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569
badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.564271 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.568787 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.568843 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.568856 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.568876 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.568888 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:16Z","lastTransitionTime":"2025-10-05T20:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.582280 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.609822 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:16:14Z\\\",\\\"message\\\":\\\"l_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:16:14.785697 6719 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:16:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.625173 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7b2637f-73a4-4dbb-b1f0-2ed149e34157\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf427d5366f0edf753e57bb958b0e30b93716e0d46779a5c6fcca2db6794c868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.642716 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.664506 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.671369 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.671486 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.671566 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.671631 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.671688 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:16Z","lastTransitionTime":"2025-10-05T20:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.696617 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:16Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.774190 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.774228 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.774239 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.774256 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.774268 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:16Z","lastTransitionTime":"2025-10-05T20:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.878133 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.878240 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.878264 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.878291 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.878312 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:16Z","lastTransitionTime":"2025-10-05T20:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.982381 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.982458 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.982481 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.982513 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:16 crc kubenswrapper[4753]: I1005 20:16:16.982542 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:16Z","lastTransitionTime":"2025-10-05T20:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.086569 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.086655 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.086688 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.086722 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.086745 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:17Z","lastTransitionTime":"2025-10-05T20:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.190675 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.190732 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.190749 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.190775 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.190795 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:17Z","lastTransitionTime":"2025-10-05T20:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.294977 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.295047 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.295069 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.295103 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.295133 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:17Z","lastTransitionTime":"2025-10-05T20:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.398515 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.398583 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.398602 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.398631 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.398656 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:17Z","lastTransitionTime":"2025-10-05T20:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.502057 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.502193 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.502216 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.502278 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.502300 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:17Z","lastTransitionTime":"2025-10-05T20:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.606379 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.606422 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.606436 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.606454 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.606466 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:17Z","lastTransitionTime":"2025-10-05T20:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.708800 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.708892 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.708949 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.708976 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.709039 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:17Z","lastTransitionTime":"2025-10-05T20:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.811742 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.811813 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.811831 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.811858 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.811877 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:17Z","lastTransitionTime":"2025-10-05T20:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.851714 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.851999 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:17 crc kubenswrapper[4753]: E1005 20:16:17.852215 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.852319 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:17 crc kubenswrapper[4753]: E1005 20:16:17.852719 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.852924 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:17 crc kubenswrapper[4753]: E1005 20:16:17.853039 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:17 crc kubenswrapper[4753]: E1005 20:16:17.853154 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.915422 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.916039 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.916063 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.916091 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:17 crc kubenswrapper[4753]: I1005 20:16:17.916177 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:17Z","lastTransitionTime":"2025-10-05T20:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.019762 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.019817 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.019830 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.019865 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.019880 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:18Z","lastTransitionTime":"2025-10-05T20:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.123381 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.123474 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.123493 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.123542 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.123562 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:18Z","lastTransitionTime":"2025-10-05T20:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.226891 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.226976 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.227001 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.227040 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.227072 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:18Z","lastTransitionTime":"2025-10-05T20:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.330517 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.330573 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.330590 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.330613 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.330631 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:18Z","lastTransitionTime":"2025-10-05T20:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.434880 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.434969 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.435003 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.435035 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.435073 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:18Z","lastTransitionTime":"2025-10-05T20:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.538068 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.538179 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.538210 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.538248 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.538276 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:18Z","lastTransitionTime":"2025-10-05T20:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.642694 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.642798 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.642822 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.642856 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.642880 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:18Z","lastTransitionTime":"2025-10-05T20:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.745703 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.745739 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.745747 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.745763 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.745773 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:18Z","lastTransitionTime":"2025-10-05T20:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.849048 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.849087 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.849096 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.849112 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.849123 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:18Z","lastTransitionTime":"2025-10-05T20:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.952946 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.953042 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.953056 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.953073 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:18 crc kubenswrapper[4753]: I1005 20:16:18.953085 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:18Z","lastTransitionTime":"2025-10-05T20:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.056051 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.056116 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.056131 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.056173 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.056185 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:19Z","lastTransitionTime":"2025-10-05T20:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.159563 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.159597 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.159608 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.159622 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.159632 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:19Z","lastTransitionTime":"2025-10-05T20:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.262522 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.262557 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.262569 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.262586 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.262597 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:19Z","lastTransitionTime":"2025-10-05T20:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.365290 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.365359 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.365383 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.365410 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.365430 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:19Z","lastTransitionTime":"2025-10-05T20:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.467685 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.467749 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.467766 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.467790 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.467808 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:19Z","lastTransitionTime":"2025-10-05T20:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.506583 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.506668 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.506692 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.506754 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.506774 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:19Z","lastTransitionTime":"2025-10-05T20:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:19 crc kubenswrapper[4753]: E1005 20:16:19.527431 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.531956 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.532077 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.532101 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.532243 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.532276 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:19Z","lastTransitionTime":"2025-10-05T20:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:19 crc kubenswrapper[4753]: E1005 20:16:19.551443 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{…status patch payload identical to previous attempt…}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:19Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.555774 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.555831 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.555854 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.555883 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.555906 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:19Z","lastTransitionTime":"2025-10-05T20:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:19 crc kubenswrapper[4753]: E1005 20:16:19.575492 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:19Z is after 2025-08-24T17:21:41Z"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.579104 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.579186 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.579212 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.579242 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.579263 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:19Z","lastTransitionTime":"2025-10-05T20:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:19 crc kubenswrapper[4753]: E1005 20:16:19.592378 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:19Z is after 2025-08-24T17:21:41Z"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.597461 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.597534 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.597544 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.597564 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.597575 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:19Z","lastTransitionTime":"2025-10-05T20:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:19 crc kubenswrapper[4753]: E1005 20:16:19.613804 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:19Z is after 2025-08-24T17:21:41Z"
Oct 05 20:16:19 crc kubenswrapper[4753]: E1005 20:16:19.614031 4753 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.616008 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.616065 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.616088 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.616118 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.616178 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:19Z","lastTransitionTime":"2025-10-05T20:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.719210 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.719274 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.719290 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.719315 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.719333 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:19Z","lastTransitionTime":"2025-10-05T20:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.822724 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.822783 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.822805 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.822838 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.822861 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:19Z","lastTransitionTime":"2025-10-05T20:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.852015 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr"
Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.852023 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 05 20:16:19 crc kubenswrapper[4753]: E1005 20:16:19.852126 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.852170 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.852333 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:19 crc kubenswrapper[4753]: E1005 20:16:19.852375 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:19 crc kubenswrapper[4753]: E1005 20:16:19.852674 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:19 crc kubenswrapper[4753]: E1005 20:16:19.852842 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.865991 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.925174 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.925235 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.925253 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.925278 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:19 crc kubenswrapper[4753]: I1005 20:16:19.925295 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:19Z","lastTransitionTime":"2025-10-05T20:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.029617 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.029719 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.029737 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.029765 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.029790 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:20Z","lastTransitionTime":"2025-10-05T20:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.132546 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.132579 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.132589 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.132604 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.132613 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:20Z","lastTransitionTime":"2025-10-05T20:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.234944 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.234971 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.234979 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.234991 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.234999 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:20Z","lastTransitionTime":"2025-10-05T20:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.337661 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.337761 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.337779 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.337803 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.337820 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:20Z","lastTransitionTime":"2025-10-05T20:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.440860 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.440920 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.440938 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.440962 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.440981 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:20Z","lastTransitionTime":"2025-10-05T20:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.543990 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.544102 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.544126 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.544211 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.544237 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:20Z","lastTransitionTime":"2025-10-05T20:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.647448 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.647505 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.647526 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.647550 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.647569 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:20Z","lastTransitionTime":"2025-10-05T20:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.750710 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.750771 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.750792 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.750833 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.750869 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:20Z","lastTransitionTime":"2025-10-05T20:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.853660 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.853795 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.853814 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.853839 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.853860 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:20Z","lastTransitionTime":"2025-10-05T20:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.957394 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.957484 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.957503 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.957526 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:20 crc kubenswrapper[4753]: I1005 20:16:20.957543 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:20Z","lastTransitionTime":"2025-10-05T20:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.060235 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.060310 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.060333 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.060362 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.060384 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:21Z","lastTransitionTime":"2025-10-05T20:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.163568 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.163628 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.163646 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.163672 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.163689 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:21Z","lastTransitionTime":"2025-10-05T20:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.266223 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.266281 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.266298 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.266323 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.266341 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:21Z","lastTransitionTime":"2025-10-05T20:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.369308 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.369372 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.369389 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.369416 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.369434 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:21Z","lastTransitionTime":"2025-10-05T20:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.472774 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.472833 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.472853 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.472878 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.472895 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:21Z","lastTransitionTime":"2025-10-05T20:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.576360 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.576425 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.576444 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.576469 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.576485 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:21Z","lastTransitionTime":"2025-10-05T20:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.679413 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.679500 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.679521 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.679601 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.679675 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:21Z","lastTransitionTime":"2025-10-05T20:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.783212 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.783280 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.783302 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.783330 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.783348 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:21Z","lastTransitionTime":"2025-10-05T20:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.851472 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.851489 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.851489 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:21 crc kubenswrapper[4753]: E1005 20:16:21.851692 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.851732 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:21 crc kubenswrapper[4753]: E1005 20:16:21.851877 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:21 crc kubenswrapper[4753]: E1005 20:16:21.851988 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:21 crc kubenswrapper[4753]: E1005 20:16:21.852099 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.872340 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7b2637f-73a4-4dbb-b1f0-2ed149e34157\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf427d5366f0edf753e57bb958b0e30b93716e0d46779a5c6fcca2db6794c868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.886538 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.886618 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 
20:16:21.886666 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.886691 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.886710 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:21Z","lastTransitionTime":"2025-10-05T20:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.895750 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.912290 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.930124 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.954053 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.975280 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.988876 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.988937 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.988955 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.988985 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.989003 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:21Z","lastTransitionTime":"2025-10-05T20:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:21 crc kubenswrapper[4753]: I1005 20:16:21.994684 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b6268b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-05T20:16:21Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.012241 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.027797 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.067874 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4af6e01-6222-4b13-9f88-e9667cc076d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b23c13ec0f79486f1f239164b8ef9cfccf612b25349546cdcb47dcc797acae1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b934eb95fb0ebd4eef3cbd8edadc0df5cd8c2e6f2c2559ed3441081869c0bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b000406bb8af6650fe0b747727019be9448e6c19b6469e01a2355933688e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0198c51cab04bd8c2b4c225c54462dfaf
7a92a00aa357463cef492efa09a69bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49a91a72e74f6ab35cb62c18fad465ec6fe891f9802978677f3ef0f1d1b4ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e86b41fe3af548448915a06369be226f2973795d2ac7b4da89e5611d9fdc548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e86b41fe3af548448915a06369be226f2973795d2ac7b4da89e5611d9fdc548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc9957b862552c9cbaca63673c87e9d088e99264455f9e3db8b572dbe9be0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fefc9957b862552c9cbaca63673c87e9d088e99264455f9e3db8b572dbe9be0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://60975f522caf98398fb75fe07cf9f73203a2000860de4ca7bba29b376817270a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://60975f522caf98398fb75fe07cf9f73203a2000860de4ca7bba29b376817270a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.085712 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b76c0217-b467-4806-8933-4afa084c51e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359ee28cd996a4068246945cb7d6b0cb2921ea96607659b72f61679d8e819242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.094363 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.094405 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.094417 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.094433 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.094444 4753 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:22Z","lastTransitionTime":"2025-10-05T20:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.103463 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-scrip
t\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.119768 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7ac7d15272eb1688a6a443c556a4ef57930c430084243bdff7df0d017044402\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:16:02Z\\\",\\\"message\\\":\\\"2025-10-05T20:15:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b\\\\n2025-10-05T20:15:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b to /host/opt/cni/bin/\\\\n2025-10-05T20:15:17Z [verbose] multus-daemon started\\\\n2025-10-05T20:15:17Z [verbose] Readiness Indicator file check\\\\n2025-10-05T20:16:02Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:16:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.137972 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.151842 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d
8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.169099 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569
badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.185718 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.197969 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.198053 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.198094 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.198118 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.198180 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:22Z","lastTransitionTime":"2025-10-05T20:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.201229 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.224169 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:16:14Z\\\",\\\"message\\\":\\\"l_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:16:14.785697 6719 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:16:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:22Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.300402 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.300462 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.300474 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.300498 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.300510 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:22Z","lastTransitionTime":"2025-10-05T20:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.404295 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.404365 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.404394 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.404431 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.404457 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:22Z","lastTransitionTime":"2025-10-05T20:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.507608 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.507679 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.507698 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.507725 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.507743 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:22Z","lastTransitionTime":"2025-10-05T20:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.611775 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.611867 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.611893 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.611926 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.611949 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:22Z","lastTransitionTime":"2025-10-05T20:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.715412 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.715497 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.715523 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.715557 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.715581 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:22Z","lastTransitionTime":"2025-10-05T20:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.818445 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.818497 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.818515 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.818539 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.818557 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:22Z","lastTransitionTime":"2025-10-05T20:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.921094 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.921180 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.921201 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.921224 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:22 crc kubenswrapper[4753]: I1005 20:16:22.921243 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:22Z","lastTransitionTime":"2025-10-05T20:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.024594 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.024649 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.024667 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.024691 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.024708 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:23Z","lastTransitionTime":"2025-10-05T20:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.128586 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.128659 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.128678 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.128705 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.128722 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:23Z","lastTransitionTime":"2025-10-05T20:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.232379 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.232440 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.232459 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.232487 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.232504 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:23Z","lastTransitionTime":"2025-10-05T20:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.335381 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.335439 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.335457 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.335480 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.335499 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:23Z","lastTransitionTime":"2025-10-05T20:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.439408 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.439536 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.439563 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.439594 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.439615 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:23Z","lastTransitionTime":"2025-10-05T20:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.542868 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.542919 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.542934 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.542952 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.542965 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:23Z","lastTransitionTime":"2025-10-05T20:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.646188 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.646260 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.646279 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.646308 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.646328 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:23Z","lastTransitionTime":"2025-10-05T20:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.748747 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.748828 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.748849 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.748881 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.748900 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:23Z","lastTransitionTime":"2025-10-05T20:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.851244 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.851244 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.851444 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 05 20:16:23 crc kubenswrapper[4753]: E1005 20:16:23.851707 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.852086 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr"
Oct 05 20:16:23 crc kubenswrapper[4753]: E1005 20:16:23.852270 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 05 20:16:23 crc kubenswrapper[4753]: E1005 20:16:23.852394 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9"
Oct 05 20:16:23 crc kubenswrapper[4753]: E1005 20:16:23.852481 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.853771 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.853863 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.853882 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.853945 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.853967 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:23Z","lastTransitionTime":"2025-10-05T20:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.957222 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.957280 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.957302 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.957324 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:23 crc kubenswrapper[4753]: I1005 20:16:23.957340 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:23Z","lastTransitionTime":"2025-10-05T20:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.061273 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.061353 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.061374 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.061402 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.061423 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:24Z","lastTransitionTime":"2025-10-05T20:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.164884 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.164971 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.164989 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.165015 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.165029 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:24Z","lastTransitionTime":"2025-10-05T20:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.268196 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.268253 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.268272 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.268295 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.268314 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:24Z","lastTransitionTime":"2025-10-05T20:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.371861 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.371913 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.371925 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.371945 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.371960 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:24Z","lastTransitionTime":"2025-10-05T20:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.475852 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.475898 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.475909 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.475928 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.475941 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:24Z","lastTransitionTime":"2025-10-05T20:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.578970 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.579025 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.579097 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.579123 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.579167 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:24Z","lastTransitionTime":"2025-10-05T20:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.682578 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.682646 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.682665 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.682692 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.682709 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:24Z","lastTransitionTime":"2025-10-05T20:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.786376 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.786453 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.786471 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.786496 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.786512 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:24Z","lastTransitionTime":"2025-10-05T20:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.892783 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.892963 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.892983 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.893008 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.893026 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:24Z","lastTransitionTime":"2025-10-05T20:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.996244 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.996319 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.996337 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.996363 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:24 crc kubenswrapper[4753]: I1005 20:16:24.996381 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:24Z","lastTransitionTime":"2025-10-05T20:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.100248 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.100313 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.100332 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.100356 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.100377 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:25Z","lastTransitionTime":"2025-10-05T20:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.204045 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.204205 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.204232 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.204268 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.204292 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:25Z","lastTransitionTime":"2025-10-05T20:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.307504 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.307575 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.307593 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.307669 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.307692 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:25Z","lastTransitionTime":"2025-10-05T20:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.410829 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.410898 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.410922 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.410958 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.410982 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:25Z","lastTransitionTime":"2025-10-05T20:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.514272 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.514358 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.514377 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.514412 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.514441 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:25Z","lastTransitionTime":"2025-10-05T20:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.617953 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.618008 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.618026 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.618051 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.618074 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:25Z","lastTransitionTime":"2025-10-05T20:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.720133 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.720193 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.720202 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.720215 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.720242 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:25Z","lastTransitionTime":"2025-10-05T20:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.823490 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.823525 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.823535 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.823547 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.823557 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:25Z","lastTransitionTime":"2025-10-05T20:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.852450 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.852650 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:25 crc kubenswrapper[4753]: E1005 20:16:25.852674 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.852052 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:25 crc kubenswrapper[4753]: E1005 20:16:25.852801 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.852817 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:25 crc kubenswrapper[4753]: E1005 20:16:25.852996 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:25 crc kubenswrapper[4753]: E1005 20:16:25.853296 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.926405 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.926462 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.926480 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.926503 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:25 crc kubenswrapper[4753]: I1005 20:16:25.926521 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:25Z","lastTransitionTime":"2025-10-05T20:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.029197 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.029802 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.029909 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.029992 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.030070 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:26Z","lastTransitionTime":"2025-10-05T20:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.132704 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.132773 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.132797 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.132826 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.132850 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:26Z","lastTransitionTime":"2025-10-05T20:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.235291 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.235351 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.235370 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.235398 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.235416 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:26Z","lastTransitionTime":"2025-10-05T20:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.338904 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.338934 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.338943 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.338957 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.338966 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:26Z","lastTransitionTime":"2025-10-05T20:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.441651 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.441706 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.441719 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.441735 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.441746 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:26Z","lastTransitionTime":"2025-10-05T20:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.544027 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.544082 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.544094 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.544108 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.544118 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:26Z","lastTransitionTime":"2025-10-05T20:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.646298 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.646536 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.646610 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.646695 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.646778 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:26Z","lastTransitionTime":"2025-10-05T20:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.748545 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.748574 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.748582 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.748594 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.748604 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:26Z","lastTransitionTime":"2025-10-05T20:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.851652 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.851931 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.852011 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.852093 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.852201 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:26Z","lastTransitionTime":"2025-10-05T20:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.954636 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.954689 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.954706 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.954729 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:26 crc kubenswrapper[4753]: I1005 20:16:26.954746 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:26Z","lastTransitionTime":"2025-10-05T20:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.057753 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.057804 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.057818 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.057834 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.057844 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:27Z","lastTransitionTime":"2025-10-05T20:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.161699 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.161757 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.161774 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.161796 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.161812 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:27Z","lastTransitionTime":"2025-10-05T20:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.264114 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.264397 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.264519 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.264638 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.264744 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:27Z","lastTransitionTime":"2025-10-05T20:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.368054 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.368190 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.368209 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.368233 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.368251 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:27Z","lastTransitionTime":"2025-10-05T20:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.470178 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.470403 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.470479 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.470597 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.470674 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:27Z","lastTransitionTime":"2025-10-05T20:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.573337 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.573387 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.573398 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.573415 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.573427 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:27Z","lastTransitionTime":"2025-10-05T20:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.676061 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.676451 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.676648 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.676847 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.677038 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:27Z","lastTransitionTime":"2025-10-05T20:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.779425 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.779480 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.779494 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.779511 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.779521 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:27Z","lastTransitionTime":"2025-10-05T20:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.851070 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.851195 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:27 crc kubenswrapper[4753]: E1005 20:16:27.851213 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.851079 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:27 crc kubenswrapper[4753]: E1005 20:16:27.851327 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:27 crc kubenswrapper[4753]: E1005 20:16:27.851470 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.851609 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:27 crc kubenswrapper[4753]: E1005 20:16:27.851766 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.881602 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.881637 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.881646 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.881660 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.881670 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:27Z","lastTransitionTime":"2025-10-05T20:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.983932 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.984179 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.984247 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.984316 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:27 crc kubenswrapper[4753]: I1005 20:16:27.984387 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:27Z","lastTransitionTime":"2025-10-05T20:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.087266 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.087768 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.087836 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.087910 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.087970 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:28Z","lastTransitionTime":"2025-10-05T20:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.191282 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.191336 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.191352 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.191374 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.191391 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:28Z","lastTransitionTime":"2025-10-05T20:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.293644 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.294012 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.294202 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.294373 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.294544 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:28Z","lastTransitionTime":"2025-10-05T20:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.398015 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.398078 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.398096 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.398120 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.398163 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:28Z","lastTransitionTime":"2025-10-05T20:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.501155 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.501210 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.501226 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.501248 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.501263 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:28Z","lastTransitionTime":"2025-10-05T20:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.604737 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.604803 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.604821 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.604846 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.604864 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:28Z","lastTransitionTime":"2025-10-05T20:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.708184 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.708525 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.708692 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.708837 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.708971 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:28Z","lastTransitionTime":"2025-10-05T20:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.812936 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.813003 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.813027 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.813052 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.813070 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:28Z","lastTransitionTime":"2025-10-05T20:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.916464 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.916529 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.916547 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.916575 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:28 crc kubenswrapper[4753]: I1005 20:16:28.916596 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:28Z","lastTransitionTime":"2025-10-05T20:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.019803 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.019882 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.019908 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.019938 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.019955 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:29Z","lastTransitionTime":"2025-10-05T20:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.123415 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.124245 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.124409 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.124572 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.124714 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:29Z","lastTransitionTime":"2025-10-05T20:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.227780 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.227846 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.227873 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.227904 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.227925 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:29Z","lastTransitionTime":"2025-10-05T20:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.331364 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.331431 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.331458 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.331486 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.331508 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:29Z","lastTransitionTime":"2025-10-05T20:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.434722 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.434777 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.434793 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.434815 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.434832 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:29Z","lastTransitionTime":"2025-10-05T20:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.538530 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.538597 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.538619 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.538643 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.538663 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:29Z","lastTransitionTime":"2025-10-05T20:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.642509 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.642584 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.642608 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.642635 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.642653 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:29Z","lastTransitionTime":"2025-10-05T20:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.733640 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.733890 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.734021 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.734114 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.734230 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:29Z","lastTransitionTime":"2025-10-05T20:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:29 crc kubenswrapper[4753]: E1005 20:16:29.750595 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:29Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.755170 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.755240 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.755265 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.755296 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.755317 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:29Z","lastTransitionTime":"2025-10-05T20:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:29 crc kubenswrapper[4753]: E1005 20:16:29.773788 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:29Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.778014 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.778085 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.778102 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.778126 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.778169 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:29Z","lastTransitionTime":"2025-10-05T20:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:29 crc kubenswrapper[4753]: E1005 20:16:29.794906 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:29Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.799385 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.799531 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.799620 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.799711 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.799811 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:29Z","lastTransitionTime":"2025-10-05T20:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:29 crc kubenswrapper[4753]: E1005 20:16:29.816081 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:29Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.820221 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.820362 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.820456 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.820557 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.820787 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:29Z","lastTransitionTime":"2025-10-05T20:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:29 crc kubenswrapper[4753]: E1005 20:16:29.837698 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:29Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:29 crc kubenswrapper[4753]: E1005 20:16:29.837927 4753 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.839898 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.840505 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.840721 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.840912 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.841105 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:29Z","lastTransitionTime":"2025-10-05T20:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.852561 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.852715 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:29 crc kubenswrapper[4753]: E1005 20:16:29.852781 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:29 crc kubenswrapper[4753]: E1005 20:16:29.852934 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.852973 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:29 crc kubenswrapper[4753]: E1005 20:16:29.853209 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.852980 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:29 crc kubenswrapper[4753]: E1005 20:16:29.853462 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.853946 4753 scope.go:117] "RemoveContainer" containerID="84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16" Oct 05 20:16:29 crc kubenswrapper[4753]: E1005 20:16:29.854207 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.943411 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.943489 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.943513 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.943543 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:29 crc kubenswrapper[4753]: I1005 20:16:29.943566 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:29Z","lastTransitionTime":"2025-10-05T20:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.048512 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.048554 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.048564 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.048579 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.048587 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:30Z","lastTransitionTime":"2025-10-05T20:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.151718 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.151750 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.151760 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.151776 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.151787 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:30Z","lastTransitionTime":"2025-10-05T20:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.254819 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.254865 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.254881 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.254902 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.254918 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:30Z","lastTransitionTime":"2025-10-05T20:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.357619 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.357675 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.357691 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.357713 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.357729 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:30Z","lastTransitionTime":"2025-10-05T20:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.460334 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.460415 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.460439 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.460475 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.460497 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:30Z","lastTransitionTime":"2025-10-05T20:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.562062 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.562098 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.562109 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.562125 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.562160 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:30Z","lastTransitionTime":"2025-10-05T20:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.664554 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.664592 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.664603 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.664620 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.664632 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:30Z","lastTransitionTime":"2025-10-05T20:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.767717 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.767751 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.767760 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.767775 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.767784 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:30Z","lastTransitionTime":"2025-10-05T20:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.869536 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.869591 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.869608 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.869631 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.869647 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:30Z","lastTransitionTime":"2025-10-05T20:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.972244 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.972280 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.972289 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.972303 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:30 crc kubenswrapper[4753]: I1005 20:16:30.972311 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:30Z","lastTransitionTime":"2025-10-05T20:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.074399 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.074480 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.074503 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.074529 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.074546 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:31Z","lastTransitionTime":"2025-10-05T20:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.177295 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.177423 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.177442 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.177465 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.177481 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:31Z","lastTransitionTime":"2025-10-05T20:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.280002 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.280043 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.280054 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.280069 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.280081 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:31Z","lastTransitionTime":"2025-10-05T20:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.382370 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.382421 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.382436 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.382456 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.382469 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:31Z","lastTransitionTime":"2025-10-05T20:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.485271 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.485324 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.485339 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.485361 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.485375 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:31Z","lastTransitionTime":"2025-10-05T20:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.587916 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.587979 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.587998 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.588020 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.588037 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:31Z","lastTransitionTime":"2025-10-05T20:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.692195 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.692288 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.692311 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.692336 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.692355 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:31Z","lastTransitionTime":"2025-10-05T20:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.794925 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.795000 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.795022 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.795045 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.795063 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:31Z","lastTransitionTime":"2025-10-05T20:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.851800 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.851834 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.851884 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.851914 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:31 crc kubenswrapper[4753]: E1005 20:16:31.852063 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:31 crc kubenswrapper[4753]: E1005 20:16:31.852701 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:31 crc kubenswrapper[4753]: E1005 20:16:31.852857 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:31 crc kubenswrapper[4753]: E1005 20:16:31.853031 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.868990 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7b2637f-73a4-4dbb-b1f0-2ed149e34157\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf427d5366f0edf753e57bb958b0e30b93716e0d46779a5c6fcca2db6794c868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:31Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.886649 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:31Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.898634 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.898712 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:31 crc 
kubenswrapper[4753]: I1005 20:16:31.898730 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.898754 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.898800 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:31Z","lastTransitionTime":"2025-10-05T20:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.903492 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:31Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.919627 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:31Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.941432 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:31Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.961831 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:31Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.985684 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:31Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:31 crc kubenswrapper[4753]: I1005 20:16:31.995761 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:31Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.003928 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.004023 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.004070 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.004093 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.004109 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:32Z","lastTransitionTime":"2025-10-05T20:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.005360 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:32 crc 
kubenswrapper[4753]: I1005 20:16:32.027537 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4af6e01-6222-4b13-9f88-e9667cc076d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b23c13ec0f79486f1f239164b8ef9cfccf612b25349546cdcb47dcc797acae1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://3b934eb95fb0ebd4eef3cbd8edadc0df5cd8c2e6f2c2559ed3441081869c0bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b000406bb8af6650fe0b747727019be9448e6c19b6469e01a2355933688e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0198c51cab04bd8c2b4c225c54462dfaf7a92a00aa357463cef492efa09a69bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49a91a72e74f6ab35cb62c18fad465ec6fe891f9802978677f3ef0f1d1b4ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e86b41fe3af548448915a06369be226f2973795d2ac7b4da89e5611d9fdc548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e86b41fe3af548448915a06369be226f2973795d2ac7b4da89e5611d9fdc548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc9957b862552c9cbaca63673c87e9d088e99264455f9e3db8b572dbe9be0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fefc9957b862552c9cbaca63673c87e9d088e99264455f9e3db8b572dbe9be0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://60975f522caf98398fb75fe07cf9f73203a2000860de4ca7bba29b376817270a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60975f522caf98398fb75fe07cf9f73203a2000860de4ca7bba29b376817270a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.040214 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b76c0217-b467-4806-8933-4afa084c51e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359ee28cd996a4068246945cb7d6b0cb2921ea96607659b72f61679d8e819242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.056075 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:16:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.065816 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs\") pod \"network-metrics-daemon-ktspr\" (UID: \"f99b8ef3-70ed-42e4-9217-a300fcd562d9\") " pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:32 crc kubenswrapper[4753]: E1005 20:16:32.066181 4753 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:16:32 crc kubenswrapper[4753]: E1005 20:16:32.066319 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs podName:f99b8ef3-70ed-42e4-9217-a300fcd562d9 nodeName:}" failed. No retries permitted until 2025-10-05 20:17:36.066287603 +0000 UTC m=+164.914615865 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs") pod "network-metrics-daemon-ktspr" (UID: "f99b8ef3-70ed-42e4-9217-a300fcd562d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.075613 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7ac7d15272eb1688a6a443c556a4ef57930c430084243bdff7df0d017044402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"exitCode\\\":1,\\\"fi
nishedAt\\\":\\\"2025-10-05T20:16:02Z\\\",\\\"message\\\":\\\"2025-10-05T20:15:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b\\\\n2025-10-05T20:15:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b to /host/opt/cni/bin/\\\\n2025-10-05T20:15:17Z [verbose] multus-daemon started\\\\n2025-10-05T20:15:17Z [verbose] Readiness Indicator file check\\\\n2025-10-05T20:16:02Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:16:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\"
,\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.095117 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04
fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.108556 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.108627 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.108650 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.108693 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.108717 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:32Z","lastTransitionTime":"2025-10-05T20:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.110950 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.126788 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T
20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e
125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.142119 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.158762 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.185888 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:16:14Z\\\",\\\"message\\\":\\\"l_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:16:14.785697 6719 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:16:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:32Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.212170 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.212208 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.212216 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.212231 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.212240 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:32Z","lastTransitionTime":"2025-10-05T20:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.315418 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.315469 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.315488 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.315513 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.315532 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:32Z","lastTransitionTime":"2025-10-05T20:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.418038 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.418088 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.418104 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.418131 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.418185 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:32Z","lastTransitionTime":"2025-10-05T20:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.521361 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.521416 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.521432 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.521455 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.521478 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:32Z","lastTransitionTime":"2025-10-05T20:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.624056 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.624115 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.624136 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.624211 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.624232 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:32Z","lastTransitionTime":"2025-10-05T20:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.727331 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.727646 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.727834 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.728004 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.728175 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:32Z","lastTransitionTime":"2025-10-05T20:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.830613 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.830683 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.830705 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.830733 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.830754 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:32Z","lastTransitionTime":"2025-10-05T20:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.934064 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.934116 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.934133 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.934180 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:32 crc kubenswrapper[4753]: I1005 20:16:32.934198 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:32Z","lastTransitionTime":"2025-10-05T20:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.037199 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.037677 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.037842 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.038032 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.038281 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:33Z","lastTransitionTime":"2025-10-05T20:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.141560 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.141620 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.141639 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.141664 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.141682 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:33Z","lastTransitionTime":"2025-10-05T20:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.246032 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.247178 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.247386 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.247577 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.247720 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:33Z","lastTransitionTime":"2025-10-05T20:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.352118 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.352216 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.352234 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.352258 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.352274 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:33Z","lastTransitionTime":"2025-10-05T20:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.455318 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.455369 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.455387 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.455411 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.455431 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:33Z","lastTransitionTime":"2025-10-05T20:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.558906 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.558966 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.558989 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.559019 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.559041 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:33Z","lastTransitionTime":"2025-10-05T20:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.662503 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.662563 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.662579 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.662601 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.662617 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:33Z","lastTransitionTime":"2025-10-05T20:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.765913 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.765976 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.766000 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.766028 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.766048 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:33Z","lastTransitionTime":"2025-10-05T20:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.851673 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.851714 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.851798 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.851899 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:33 crc kubenswrapper[4753]: E1005 20:16:33.851884 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:33 crc kubenswrapper[4753]: E1005 20:16:33.852086 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:33 crc kubenswrapper[4753]: E1005 20:16:33.852209 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:33 crc kubenswrapper[4753]: E1005 20:16:33.852522 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.868631 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.868680 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.868696 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.868718 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.868736 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:33Z","lastTransitionTime":"2025-10-05T20:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.971566 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.971631 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.971650 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.971679 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:33 crc kubenswrapper[4753]: I1005 20:16:33.971701 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:33Z","lastTransitionTime":"2025-10-05T20:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.074887 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.074926 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.074934 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.074948 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.074959 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:34Z","lastTransitionTime":"2025-10-05T20:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.178278 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.178318 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.178329 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.178344 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.178355 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:34Z","lastTransitionTime":"2025-10-05T20:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.280297 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.280323 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.280333 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.280346 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.280354 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:34Z","lastTransitionTime":"2025-10-05T20:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.383203 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.383245 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.383256 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.383274 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.383285 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:34Z","lastTransitionTime":"2025-10-05T20:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.486097 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.486205 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.486226 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.486251 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.486271 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:34Z","lastTransitionTime":"2025-10-05T20:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.588904 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.588945 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.588958 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.588976 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.588988 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:34Z","lastTransitionTime":"2025-10-05T20:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.691555 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.691596 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.691605 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.691619 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.691628 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:34Z","lastTransitionTime":"2025-10-05T20:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.793758 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.793798 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.793810 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.793826 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.793837 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:34Z","lastTransitionTime":"2025-10-05T20:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.895691 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.895744 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.895755 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.895773 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.895785 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:34Z","lastTransitionTime":"2025-10-05T20:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.998268 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.998329 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.998340 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.998357 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:34 crc kubenswrapper[4753]: I1005 20:16:34.998368 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:34Z","lastTransitionTime":"2025-10-05T20:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.100415 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.100467 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.100482 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.100506 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.100520 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:35Z","lastTransitionTime":"2025-10-05T20:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.202494 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.202539 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.202554 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.202575 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.202590 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:35Z","lastTransitionTime":"2025-10-05T20:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.304659 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.304693 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.304702 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.304717 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.304727 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:35Z","lastTransitionTime":"2025-10-05T20:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.407364 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.407413 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.407430 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.407450 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.407465 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:35Z","lastTransitionTime":"2025-10-05T20:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.509308 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.509391 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.509415 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.509446 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.509466 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:35Z","lastTransitionTime":"2025-10-05T20:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.612075 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.612105 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.612115 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.612130 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.612170 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:35Z","lastTransitionTime":"2025-10-05T20:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.713915 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.713951 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.713962 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.713978 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.713988 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:35Z","lastTransitionTime":"2025-10-05T20:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.816545 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.816620 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.816642 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.816671 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.816697 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:35Z","lastTransitionTime":"2025-10-05T20:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.851123 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.851207 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.851233 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.851123 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:35 crc kubenswrapper[4753]: E1005 20:16:35.851347 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:35 crc kubenswrapper[4753]: E1005 20:16:35.851450 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:35 crc kubenswrapper[4753]: E1005 20:16:35.851530 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:35 crc kubenswrapper[4753]: E1005 20:16:35.851772 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.919044 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.919088 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.919097 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.919109 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:35 crc kubenswrapper[4753]: I1005 20:16:35.919117 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:35Z","lastTransitionTime":"2025-10-05T20:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.022152 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.022197 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.022208 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.022226 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.022238 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:36Z","lastTransitionTime":"2025-10-05T20:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.125489 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.125556 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.125574 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.125597 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.125615 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:36Z","lastTransitionTime":"2025-10-05T20:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.228360 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.228438 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.228463 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.228496 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.228518 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:36Z","lastTransitionTime":"2025-10-05T20:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.331307 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.332295 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.332497 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.332684 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.332874 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:36Z","lastTransitionTime":"2025-10-05T20:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.435306 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.435395 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.435452 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.435480 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.435504 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:36Z","lastTransitionTime":"2025-10-05T20:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.538296 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.538635 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.538786 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.538923 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.539065 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:36Z","lastTransitionTime":"2025-10-05T20:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.642072 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.642212 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.642234 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.642259 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.642275 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:36Z","lastTransitionTime":"2025-10-05T20:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.745333 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.745433 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.745456 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.745485 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.745505 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:36Z","lastTransitionTime":"2025-10-05T20:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.849071 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.849478 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.849672 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.849901 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.850089 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:36Z","lastTransitionTime":"2025-10-05T20:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.954024 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.954070 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.954086 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.954107 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:36 crc kubenswrapper[4753]: I1005 20:16:36.954123 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:36Z","lastTransitionTime":"2025-10-05T20:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.056403 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.056430 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.056471 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.056486 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.056495 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:37Z","lastTransitionTime":"2025-10-05T20:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.159132 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.159183 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.159196 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.159211 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.159222 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:37Z","lastTransitionTime":"2025-10-05T20:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.261892 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.261924 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.261934 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.261945 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.261953 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:37Z","lastTransitionTime":"2025-10-05T20:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.363809 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.364436 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.364535 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.364665 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.364869 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:37Z","lastTransitionTime":"2025-10-05T20:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.466849 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.467107 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.467266 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.467375 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.467509 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:37Z","lastTransitionTime":"2025-10-05T20:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.569458 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.569499 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.569510 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.569527 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.569538 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:37Z","lastTransitionTime":"2025-10-05T20:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.671812 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.671854 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.671866 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.671883 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.671895 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:37Z","lastTransitionTime":"2025-10-05T20:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.774442 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.774502 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.774520 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.774543 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.774560 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:37Z","lastTransitionTime":"2025-10-05T20:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.851764 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.851821 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.851821 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.851906 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:37 crc kubenswrapper[4753]: E1005 20:16:37.852315 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:37 crc kubenswrapper[4753]: E1005 20:16:37.852496 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:37 crc kubenswrapper[4753]: E1005 20:16:37.852648 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:37 crc kubenswrapper[4753]: E1005 20:16:37.853012 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.877225 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.877267 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.877276 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.877311 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.877323 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:37Z","lastTransitionTime":"2025-10-05T20:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.980489 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.980554 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.980572 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.980598 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:37 crc kubenswrapper[4753]: I1005 20:16:37.980616 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:37Z","lastTransitionTime":"2025-10-05T20:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.083364 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.083410 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.083424 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.083446 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.083465 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:38Z","lastTransitionTime":"2025-10-05T20:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.186283 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.186343 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.186363 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.186389 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.186411 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:38Z","lastTransitionTime":"2025-10-05T20:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.289400 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.289456 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.289478 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.289502 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.289518 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:38Z","lastTransitionTime":"2025-10-05T20:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.392207 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.392275 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.392290 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.392320 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.392334 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:38Z","lastTransitionTime":"2025-10-05T20:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.493969 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.494180 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.494194 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.494206 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.494214 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:38Z","lastTransitionTime":"2025-10-05T20:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.596978 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.597014 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.597025 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.597040 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.597052 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:38Z","lastTransitionTime":"2025-10-05T20:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.699453 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.699481 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.699489 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.699501 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.699511 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:38Z","lastTransitionTime":"2025-10-05T20:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.801393 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.801475 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.801495 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.801511 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.801563 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:38Z","lastTransitionTime":"2025-10-05T20:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.903248 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.903292 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.903304 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.903321 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:38 crc kubenswrapper[4753]: I1005 20:16:38.903345 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:38Z","lastTransitionTime":"2025-10-05T20:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.005968 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.006003 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.006016 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.006032 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.006043 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:39Z","lastTransitionTime":"2025-10-05T20:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.107953 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.107990 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.108001 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.108016 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.108027 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:39Z","lastTransitionTime":"2025-10-05T20:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.210437 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.210574 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.210598 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.210627 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.210649 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:39Z","lastTransitionTime":"2025-10-05T20:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.313602 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.313635 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.313644 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.313657 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.313668 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:39Z","lastTransitionTime":"2025-10-05T20:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.416114 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.416218 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.416238 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.416265 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.416285 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:39Z","lastTransitionTime":"2025-10-05T20:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.518901 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.519247 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.519433 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.519615 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.519792 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:39Z","lastTransitionTime":"2025-10-05T20:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.622323 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.622354 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.622363 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.622376 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.622385 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:39Z","lastTransitionTime":"2025-10-05T20:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.725176 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.725251 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.725278 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.725308 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.725331 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:39Z","lastTransitionTime":"2025-10-05T20:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.827823 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.827853 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.827861 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.827873 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.827882 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:39Z","lastTransitionTime":"2025-10-05T20:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.849290 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.849326 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.849337 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.849351 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.849360 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:39Z","lastTransitionTime":"2025-10-05T20:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.851973 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.852016 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:39 crc kubenswrapper[4753]: E1005 20:16:39.852090 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.852131 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.852240 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:39 crc kubenswrapper[4753]: E1005 20:16:39.852312 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:39 crc kubenswrapper[4753]: E1005 20:16:39.852365 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:39 crc kubenswrapper[4753]: E1005 20:16:39.852436 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:39 crc kubenswrapper[4753]: E1005 20:16:39.866529 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:39Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.871220 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.871286 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.871297 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.871312 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.871322 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:39Z","lastTransitionTime":"2025-10-05T20:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:39 crc kubenswrapper[4753]: E1005 20:16:39.888243 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:39Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.892195 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.892263 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.892276 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.892324 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.892338 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:39Z","lastTransitionTime":"2025-10-05T20:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:39 crc kubenswrapper[4753]: E1005 20:16:39.909217 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:39Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.913399 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.913426 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.913437 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.913450 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.913460 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:39Z","lastTransitionTime":"2025-10-05T20:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:39 crc kubenswrapper[4753]: E1005 20:16:39.925256 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:39Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.928310 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.928344 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.928353 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.928367 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.928376 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:39Z","lastTransitionTime":"2025-10-05T20:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:39 crc kubenswrapper[4753]: E1005 20:16:39.945853 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fe6c287-7fce-4f83-8f74-f3c461744d43\\\",\\\"systemUUID\\\":\\\"5031572e-89d6-40ea-86fd-ab9d0632be0c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:39Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:39 crc kubenswrapper[4753]: E1005 20:16:39.945971 4753 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.947945 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.947970 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.947979 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.947992 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:39 crc kubenswrapper[4753]: I1005 20:16:39.948003 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:39Z","lastTransitionTime":"2025-10-05T20:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.050902 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.050968 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.050987 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.051011 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.051028 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:40Z","lastTransitionTime":"2025-10-05T20:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.153352 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.153417 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.153431 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.153448 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.153462 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:40Z","lastTransitionTime":"2025-10-05T20:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.256886 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.256927 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.256938 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.256954 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.256968 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:40Z","lastTransitionTime":"2025-10-05T20:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.359951 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.360028 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.360042 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.360058 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.360069 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:40Z","lastTransitionTime":"2025-10-05T20:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.462126 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.462237 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.462261 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.462292 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.462312 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:40Z","lastTransitionTime":"2025-10-05T20:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.564535 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.564589 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.564600 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.564614 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.564647 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:40Z","lastTransitionTime":"2025-10-05T20:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.672299 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.672365 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.673053 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.673130 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.673176 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:40Z","lastTransitionTime":"2025-10-05T20:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.775533 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.775571 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.775581 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.775593 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.775602 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:40Z","lastTransitionTime":"2025-10-05T20:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.878378 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.878445 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.878473 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.878504 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.878528 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:40Z","lastTransitionTime":"2025-10-05T20:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.981232 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.981297 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.981314 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.981340 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:40 crc kubenswrapper[4753]: I1005 20:16:40.981367 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:40Z","lastTransitionTime":"2025-10-05T20:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.084963 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.085021 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.085040 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.085069 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.085089 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:41Z","lastTransitionTime":"2025-10-05T20:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.188658 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.188708 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.188720 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.188738 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.188751 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:41Z","lastTransitionTime":"2025-10-05T20:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.292317 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.292371 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.292389 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.292413 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.292430 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:41Z","lastTransitionTime":"2025-10-05T20:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.395070 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.395130 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.395188 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.395222 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.395243 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:41Z","lastTransitionTime":"2025-10-05T20:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.497557 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.497607 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.497625 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.497647 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.497666 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:41Z","lastTransitionTime":"2025-10-05T20:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.600440 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.600521 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.600545 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.600573 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.600594 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:41Z","lastTransitionTime":"2025-10-05T20:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.703483 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.703546 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.703564 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.703588 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.703606 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:41Z","lastTransitionTime":"2025-10-05T20:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.806133 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.806228 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.806245 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.806268 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.806284 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:41Z","lastTransitionTime":"2025-10-05T20:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.851652 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.851825 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:41 crc kubenswrapper[4753]: E1005 20:16:41.851829 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.851923 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.852032 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:41 crc kubenswrapper[4753]: E1005 20:16:41.852057 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:41 crc kubenswrapper[4753]: E1005 20:16:41.852173 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:41 crc kubenswrapper[4753]: E1005 20:16:41.852350 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.877528 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25jcm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bbfd1eb-16b3-420d-acab-5770837c14fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3ba1bcbbe8828e0aa5852e6aee06604a0ba54694db71d8573eba1f600c2510\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfad3fbc279c28bc73aca70d8a819d90a520715e8f03f84674baa6f7c3c52b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404349f50785a9c69151a945e3650af70c449a18ad649ad738e0041d8f20f1ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6936ab1be96afa66fb55bb739756b0b6d2f8f93f5be1b2bf615e33d8cf0a0192\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b04fbd49306f156eb7f61ff9683a4364c6dfeb0c74b36f6c8c42fabb573005\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29115e14e98790f22e3db7c52bc306e8f06c07fb64520b910fed5ec28e6c5e51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec30829ce3d0e4e78273a50051dbd47dc4be531c5a43b656d999c6717cadd694\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kd8z9\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25jcm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:41Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.897563 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8eda6d8e-f029-46df-8965-bb8302a46cf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://886bd9047fcbcf22ed1d3cb28cc5cc47b460aa98d899468964e5f073c151a07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f
12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b97b8bcd309c5061e240f1a7e6a3eba6aa6d8178dbfbe3e4cef025e7dd2e6a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w95mm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:27Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88v8l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:41Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.909187 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.909267 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.909293 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.909327 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.909349 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:41Z","lastTransitionTime":"2025-10-05T20:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.932107 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4af6e01-6222-4b13-9f88-e9667cc076d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b23c13ec0f79486f1f239164b8ef9cfccf612b25349546cdcb47dcc797acae1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b934eb95fb0ebd4eef3cbd8edadc0df5cd8c2e6f2c2559ed3441081869c0bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b000406bb8af6650fe0b747727019be9448e6c19b6469e01a2355933688e105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0198c51cab04bd8c2b4c225c54462dfaf7a92a00aa357463cef492efa09a69bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49a91a72e74f6ab35cb62c18fad465ec6fe891f9802978677f3ef0f1d1b4ddd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e86b41fe3af548448915a06369be226f2973795d2ac7b4da89e5611d9fdc548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e86b41fe3af548448915a06369be226f2973795d2ac7b4da89e5611d9fdc548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc9957b862552c9cbaca63673c87e9d088e99264455f9e3db8b572dbe9be0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fefc9957b862552c9cbaca63673c87e9d088e99264455f9e3db8b572dbe9be0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://60975f522caf98398fb75fe07cf9f73203a2000860de4ca7bba29b376817270a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60975f522caf98398fb75fe07cf9f73203a2000860de4ca7bba29b376817270a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-05T20:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:41Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.951919 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b76c0217-b467-4806-8933-4afa084c51e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://359ee28cd996a4068246945cb7d6b0cb2921ea96607659b72f61679d8e819242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3862dca9a162732558de66abc59d827f07c5ff1bd8a51d37d1799b60ba8b77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:41Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.973347 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd32ee873f06b2865d4379152292b1420723a2060f9b3d70b20c133ed4b5d598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-05T20:16:41Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:41 crc kubenswrapper[4753]: I1005 20:16:41.995533 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr5q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a6cead6-0872-4b49-a08c-529805f646f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:16:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7ac7d15272eb1688a6a443c556a4ef57930c430084243bdff7df0d017044402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:16:02Z\\\",\\\"message\\\":\\\"2025-10-05T20:15:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b\\\\n2025-10-05T20:15:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_681b2a9f-0304-41bb-a1e6-e9bfd787566b to /host/opt/cni/bin/\\\\n2025-10-05T20:15:17Z [verbose] multus-daemon started\\\\n2025-10-05T20:15:17Z [verbose] Readiness Indicator file check\\\\n2025-10-05T20:16:02Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:16:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pj69\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr5q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:41Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.012964 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.013023 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.013041 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.013065 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.013081 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:42Z","lastTransitionTime":"2025-10-05T20:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.017761 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90f6e089-94fd-41e8-8e9d-562d89f02769\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cffc198d1363b46309c7f837500868b5e2ea1ce5539876bf8d2475088eef4fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df8baaad5856ffbe7e11b8fcb9912d2b12bd176db4a8052d6af86c11fc7fe7cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eaa022045922053838278fa90b9604816253720b89b03a863f1f3d6b9accfdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e
c25263d4cf23fce66c516f6f8049149f0fefec42b37386cb3c8929add490780\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b0576ffb5d29cb842a1405e9b4945037fbe2b6037d1bcdf0ae82f3dadb948b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW1005 20:15:11.540486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1005 20:15:11.540583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1005 20:15:11.541256 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2401338286/tls.crt::/tmp/serving-cert-2401338286/tls.key\\\\\\\"\\\\nI1005 20:15:12.027382 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1005 20:15:12.031771 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1005 20:15:12.031790 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1005 20:15:12.031812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1005 20:15:12.031819 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1005 20:15:12.036967 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1005 20:15:12.036984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036989 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1005 20:15:12.036993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI1005 20:15:12.036991 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1005 20:15:12.036997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1005 20:15:12.037008 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1005 20:15:12.037014 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1005 20:15:12.039564 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13817f3ce200b5822501d6e64c3f621d0fcb4c1ba012805ded007b3f581bb315\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4102e125d445032c47b0e4a4d2948d4569badebb69ed32ad052d1347d460543\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.038082 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0719a6e3-7990-452d-8f9a-e2d0751e646b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ab581c0e1a47539f9cf7f23292b00139febee58ecc8af088eb235b95f14b46e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0ba293055947cd9ac3ad8cc4aeceebc4725d3110c12a6e215e886eb03a0fcb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b59d6146818fe2dd97626ea95ad122a3ad70a3930a5bde73baf024103a8e5a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bff3c73d963054c7d8b06233dc8722f126059dc0ad47fe64fc6b2ebf644ab7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-05T20:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.061313 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.091014 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-05T20:16:14Z\\\",\\\"message\\\":\\\"l_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-authentication-operator/metrics]} name:Service_openshift-authentication-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1005 20:16:14.785697 6719 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:14Z is after 2025-08-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-05T20:16:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cd910a4278caa7df5
240583e6c23e895173c6bc83451a482efbfc59bc8244c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7j8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-htbfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.112518 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7b2637f-73a4-4dbb-b1f0-2ed149e34157\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3162645925f249750ae6ef0136b28459a949ab5097da9ac48841312a0c1cc43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf427d5366f0edf753e57bb958b0e30b93716e0d46779a5c6fcca2db6794c868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1629003efbc02c7e0f18f7960a8002d9834606d268109b5986ac3c9ef3d004b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://58b2dc80228eed5c17aefada1550ab75a01756bd3b08f04c6d5f54ce6e184583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-05T20:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-05T20:14:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:14:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.118816 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.118893 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.118915 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.119290 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.119514 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:42Z","lastTransitionTime":"2025-10-05T20:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.134557 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.153390 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.170998 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bcrnh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f4dc067-d682-4823-8969-64a0184e623d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e981c0be75e65e33ec85d767f98fe4d855a6449d3275a13cc17ea7c4b81914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4l9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bcrnh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.190212 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ktspr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f99b8ef3-70ed-42e4-9217-a300fcd562d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7dct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ktspr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:42 crc 
kubenswrapper[4753]: I1005 20:16:42.211936 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878db6f40d1ee63f21ebad9bb3e922b264bc67f791c240880e99d369f73cd78d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.222315 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.222379 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.222398 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.222423 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.222441 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:42Z","lastTransitionTime":"2025-10-05T20:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.231461 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6957a87d4f411a9f46039fb0a04493e328ca4d16bb353252c4707b6591538e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd57e1cf6ef8ee0e282b571aba8ca2df692d91ed7edd317076b958a7799c02a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.248728 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a422d983-1769-4d79-9e71-b63bef552d37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd282db68bc742f7247621a410ae7ebba943557df5df6fcc65efc1628901be9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4e79e247970335f89b8f52baa9ff77046d6b626
8b27ced5897daf482809e4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rd6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xlrkd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.261455 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v2pmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a89c4e01-7849-478a-bf2b-07701a1c5ef3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-05T20:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb9c7f4345d68a61a807d6744ba62dd8987458d6b17effc0557a60d93cf5ed97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-05T20:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6chvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-05T20:15:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v2pmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-05T20:16:42Z is after 2025-08-24T17:21:41Z" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.325089 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.325197 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.325225 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.325259 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.325286 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:42Z","lastTransitionTime":"2025-10-05T20:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.428484 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.428588 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.428613 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.428646 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.428670 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:42Z","lastTransitionTime":"2025-10-05T20:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.530606 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.530644 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.530652 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.530665 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.530674 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:42Z","lastTransitionTime":"2025-10-05T20:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.634012 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.634054 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.634064 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.634079 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.634089 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:42Z","lastTransitionTime":"2025-10-05T20:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.736132 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.736187 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.736197 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.736213 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.736224 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:42Z","lastTransitionTime":"2025-10-05T20:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.839346 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.839414 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.839437 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.839487 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.839510 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:42Z","lastTransitionTime":"2025-10-05T20:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.942772 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.942828 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.942850 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.942880 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:42 crc kubenswrapper[4753]: I1005 20:16:42.942902 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:42Z","lastTransitionTime":"2025-10-05T20:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.045543 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.045613 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.045639 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.045668 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.045690 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:43Z","lastTransitionTime":"2025-10-05T20:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.149077 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.149490 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.149729 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.149950 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.150187 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:43Z","lastTransitionTime":"2025-10-05T20:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.253831 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.253886 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.253904 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.253930 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.253950 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:43Z","lastTransitionTime":"2025-10-05T20:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.357743 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.357807 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.357826 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.357853 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.357874 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:43Z","lastTransitionTime":"2025-10-05T20:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.460701 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.460746 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.460763 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.460787 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.460804 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:43Z","lastTransitionTime":"2025-10-05T20:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.563664 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.563728 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.563746 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.563770 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.563790 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:43Z","lastTransitionTime":"2025-10-05T20:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.666849 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.666943 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.666963 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.666994 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.667015 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:43Z","lastTransitionTime":"2025-10-05T20:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.770380 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.770475 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.770496 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.770535 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.770560 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:43Z","lastTransitionTime":"2025-10-05T20:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.851231 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:43 crc kubenswrapper[4753]: E1005 20:16:43.851354 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.852021 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.852131 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.852206 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:43 crc kubenswrapper[4753]: E1005 20:16:43.853003 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:43 crc kubenswrapper[4753]: E1005 20:16:43.853190 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:43 crc kubenswrapper[4753]: E1005 20:16:43.853519 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.853862 4753 scope.go:117] "RemoveContainer" containerID="84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16" Oct 05 20:16:43 crc kubenswrapper[4753]: E1005 20:16:43.854261 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-htbfn_openshift-ovn-kubernetes(fa1e6bd4-ce05-4757-bab2-6addb9d0111e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.873698 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.873737 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.873753 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.873771 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.873785 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:43Z","lastTransitionTime":"2025-10-05T20:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.976564 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.976627 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.976646 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.976674 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:43 crc kubenswrapper[4753]: I1005 20:16:43.976693 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:43Z","lastTransitionTime":"2025-10-05T20:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.080459 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.080516 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.080533 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.080558 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.080576 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:44Z","lastTransitionTime":"2025-10-05T20:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.183527 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.183612 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.183638 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.183665 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.183682 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:44Z","lastTransitionTime":"2025-10-05T20:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.287231 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.287645 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.287806 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.287976 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.288122 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:44Z","lastTransitionTime":"2025-10-05T20:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.391869 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.391924 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.391943 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.391969 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.391986 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:44Z","lastTransitionTime":"2025-10-05T20:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.494970 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.495022 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.495034 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.495052 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.495065 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:44Z","lastTransitionTime":"2025-10-05T20:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.597536 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.597611 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.597630 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.597650 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.597665 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:44Z","lastTransitionTime":"2025-10-05T20:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.700911 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.700988 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.701012 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.701042 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.701064 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:44Z","lastTransitionTime":"2025-10-05T20:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.804305 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.804377 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.804390 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.804406 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.804417 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:44Z","lastTransitionTime":"2025-10-05T20:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.907621 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.907667 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.907679 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.907698 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:44 crc kubenswrapper[4753]: I1005 20:16:44.907711 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:44Z","lastTransitionTime":"2025-10-05T20:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.010944 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.010984 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.011001 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.011020 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.011037 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:45Z","lastTransitionTime":"2025-10-05T20:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.113718 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.113771 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.113788 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.113847 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.113865 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:45Z","lastTransitionTime":"2025-10-05T20:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.216669 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.216723 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.216742 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.216766 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.216783 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:45Z","lastTransitionTime":"2025-10-05T20:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.319796 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.319858 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.319881 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.319913 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.319938 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:45Z","lastTransitionTime":"2025-10-05T20:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.423611 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.423646 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.423662 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.423681 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.423695 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:45Z","lastTransitionTime":"2025-10-05T20:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.525766 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.525821 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.525839 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.525861 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.525878 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:45Z","lastTransitionTime":"2025-10-05T20:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.629470 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.629511 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.629527 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.629549 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.629562 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:45Z","lastTransitionTime":"2025-10-05T20:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.731980 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.732021 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.732035 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.732051 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.732063 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:45Z","lastTransitionTime":"2025-10-05T20:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.835690 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.835738 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.835754 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.835777 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.835789 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:45Z","lastTransitionTime":"2025-10-05T20:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.851275 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.851345 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:45 crc kubenswrapper[4753]: E1005 20:16:45.851389 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:45 crc kubenswrapper[4753]: E1005 20:16:45.851499 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.851276 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:45 crc kubenswrapper[4753]: E1005 20:16:45.851674 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.852016 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:45 crc kubenswrapper[4753]: E1005 20:16:45.852320 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.937977 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.938040 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.938064 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.938094 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:45 crc kubenswrapper[4753]: I1005 20:16:45.938120 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:45Z","lastTransitionTime":"2025-10-05T20:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.040382 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.040640 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.040756 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.040849 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.040932 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:46Z","lastTransitionTime":"2025-10-05T20:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.143682 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.144009 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.144111 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.144241 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.144334 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:46Z","lastTransitionTime":"2025-10-05T20:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.246868 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.246939 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.246959 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.246983 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.247000 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:46Z","lastTransitionTime":"2025-10-05T20:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.349970 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.349999 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.350015 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.350028 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.350038 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:46Z","lastTransitionTime":"2025-10-05T20:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.454689 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.454751 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.454770 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.454797 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.454816 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:46Z","lastTransitionTime":"2025-10-05T20:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.558374 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.558437 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.558454 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.558481 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.558501 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:46Z","lastTransitionTime":"2025-10-05T20:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.662080 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.662175 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.662199 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.662227 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.662245 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:46Z","lastTransitionTime":"2025-10-05T20:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.765760 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.765819 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.765838 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.765868 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.765885 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:46Z","lastTransitionTime":"2025-10-05T20:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.869371 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.869417 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.869430 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.869448 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.869461 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:46Z","lastTransitionTime":"2025-10-05T20:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.971905 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.971964 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.971984 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.972009 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:46 crc kubenswrapper[4753]: I1005 20:16:46.972027 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:46Z","lastTransitionTime":"2025-10-05T20:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.075296 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.075364 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.075384 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.075408 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.075425 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:47Z","lastTransitionTime":"2025-10-05T20:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.178588 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.178951 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.179051 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.179191 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.179333 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:47Z","lastTransitionTime":"2025-10-05T20:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.282472 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.282524 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.282541 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.282565 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.282582 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:47Z","lastTransitionTime":"2025-10-05T20:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.385582 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.385633 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.385646 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.385663 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.385675 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:47Z","lastTransitionTime":"2025-10-05T20:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.488576 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.488641 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.488665 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.488693 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.488715 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:47Z","lastTransitionTime":"2025-10-05T20:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.592026 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.592094 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.592113 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.592170 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.592190 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:47Z","lastTransitionTime":"2025-10-05T20:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.695678 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.695725 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.695738 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.695756 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.695769 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:47Z","lastTransitionTime":"2025-10-05T20:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.798548 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.798829 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.798842 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.798862 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.798877 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:47Z","lastTransitionTime":"2025-10-05T20:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.852222 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:47 crc kubenswrapper[4753]: E1005 20:16:47.852402 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.852675 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:47 crc kubenswrapper[4753]: E1005 20:16:47.852846 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.853068 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:47 crc kubenswrapper[4753]: E1005 20:16:47.853195 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.853397 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:47 crc kubenswrapper[4753]: E1005 20:16:47.853492 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.901296 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.901332 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.901346 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.901363 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:47 crc kubenswrapper[4753]: I1005 20:16:47.901374 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:47Z","lastTransitionTime":"2025-10-05T20:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.004328 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.004391 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.004477 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.004546 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.004578 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:48Z","lastTransitionTime":"2025-10-05T20:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.107543 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.107600 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.107616 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.107644 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.107664 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:48Z","lastTransitionTime":"2025-10-05T20:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.210844 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.210915 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.210936 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.210960 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.210978 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:48Z","lastTransitionTime":"2025-10-05T20:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.313678 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.313721 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.313732 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.313819 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.313834 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:48Z","lastTransitionTime":"2025-10-05T20:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.417101 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.417211 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.417236 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.417267 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.417294 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:48Z","lastTransitionTime":"2025-10-05T20:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.463028 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr5q8_8a6cead6-0872-4b49-a08c-529805f646f2/kube-multus/1.log"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.463823 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr5q8_8a6cead6-0872-4b49-a08c-529805f646f2/kube-multus/0.log"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.463922 4753 generic.go:334] "Generic (PLEG): container finished" podID="8a6cead6-0872-4b49-a08c-529805f646f2" containerID="d7ac7d15272eb1688a6a443c556a4ef57930c430084243bdff7df0d017044402" exitCode=1
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.463965 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr5q8" event={"ID":"8a6cead6-0872-4b49-a08c-529805f646f2","Type":"ContainerDied","Data":"d7ac7d15272eb1688a6a443c556a4ef57930c430084243bdff7df0d017044402"}
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.464015 4753 scope.go:117] "RemoveContainer" containerID="a6648fdcfc20641c45091b69d9b7ea8cf0cf1b45aa377579a82602867862e039"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.464582 4753 scope.go:117] "RemoveContainer" containerID="d7ac7d15272eb1688a6a443c556a4ef57930c430084243bdff7df0d017044402"
Oct 05 20:16:48 crc kubenswrapper[4753]: E1005 20:16:48.464781 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-zr5q8_openshift-multus(8a6cead6-0872-4b49-a08c-529805f646f2)\"" pod="openshift-multus/multus-zr5q8" podUID="8a6cead6-0872-4b49-a08c-529805f646f2"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.520043 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.520091 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.520100 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.520113 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.520121 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:48Z","lastTransitionTime":"2025-10-05T20:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.524993 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-25jcm" podStartSLOduration=95.524966676 podStartE2EDuration="1m35.524966676s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:16:48.524059478 +0000 UTC m=+117.372387750" watchObservedRunningTime="2025-10-05 20:16:48.524966676 +0000 UTC m=+117.373294948"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.545707 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88v8l" podStartSLOduration=94.545689694 podStartE2EDuration="1m34.545689694s" podCreationTimestamp="2025-10-05 20:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:16:48.544471797 +0000 UTC m=+117.392800029" watchObservedRunningTime="2025-10-05 20:16:48.545689694 +0000 UTC m=+117.394017926"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.573806 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=29.573790526 podStartE2EDuration="29.573790526s" podCreationTimestamp="2025-10-05 20:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:16:48.572283901 +0000 UTC m=+117.420612133" watchObservedRunningTime="2025-10-05 20:16:48.573790526 +0000 UTC m=+117.422118758"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.605765 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=45.605753315 podStartE2EDuration="45.605753315s" podCreationTimestamp="2025-10-05 20:16:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:16:48.589572284 +0000 UTC m=+117.437900556" watchObservedRunningTime="2025-10-05 20:16:48.605753315 +0000 UTC m=+117.454081547"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.622503 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.622524 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.622531 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.622544 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.622553 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:48Z","lastTransitionTime":"2025-10-05T20:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.686012 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=96.685985857 podStartE2EDuration="1m36.685985857s" podCreationTimestamp="2025-10-05 20:15:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:16:48.685180893 +0000 UTC m=+117.533509165" watchObservedRunningTime="2025-10-05 20:16:48.685985857 +0000 UTC m=+117.534314119"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.686346 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=96.686331097 podStartE2EDuration="1m36.686331097s" podCreationTimestamp="2025-10-05 20:15:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:16:48.664120374 +0000 UTC m=+117.512448646" watchObservedRunningTime="2025-10-05 20:16:48.686331097 +0000 UTC m=+117.534659359"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.724854 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.724890 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.724901 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.724916 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.724928 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:48Z","lastTransitionTime":"2025-10-05T20:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.736921 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=66.73689889 podStartE2EDuration="1m6.73689889s" podCreationTimestamp="2025-10-05 20:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:16:48.736208819 +0000 UTC m=+117.584537061" watchObservedRunningTime="2025-10-05 20:16:48.73689889 +0000 UTC m=+117.585227142"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.737107 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bcrnh" podStartSLOduration=95.737101586 podStartE2EDuration="1m35.737101586s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:16:48.717906584 +0000 UTC m=+117.566234856" watchObservedRunningTime="2025-10-05 20:16:48.737101586 +0000 UTC m=+117.585429828"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.785612 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-v2pmn" podStartSLOduration=95.785589966 podStartE2EDuration="1m35.785589966s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:16:48.784652848 +0000 UTC m=+117.632981090" watchObservedRunningTime="2025-10-05 20:16:48.785589966 +0000 UTC m=+117.633918218"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.827357 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.827394 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.827406 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.827441 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.827452 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:48Z","lastTransitionTime":"2025-10-05T20:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.845113 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podStartSLOduration=95.84509502 podStartE2EDuration="1m35.84509502s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:16:48.84444516 +0000 UTC m=+117.692773392" watchObservedRunningTime="2025-10-05 20:16:48.84509502 +0000 UTC m=+117.693423262"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.929691 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.929725 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.929735 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.929749 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:48 crc kubenswrapper[4753]: I1005 20:16:48.929758 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:48Z","lastTransitionTime":"2025-10-05T20:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.032671 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.032701 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.032713 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.032729 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.032743 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:49Z","lastTransitionTime":"2025-10-05T20:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.137099 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.137238 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.137257 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.137279 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.137297 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:49Z","lastTransitionTime":"2025-10-05T20:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.240450 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.240525 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.240546 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.240571 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.240589 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:49Z","lastTransitionTime":"2025-10-05T20:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.343090 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.343175 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.343193 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.343222 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.343239 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:49Z","lastTransitionTime":"2025-10-05T20:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.445900 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.445974 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.446000 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.446031 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.446057 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:49Z","lastTransitionTime":"2025-10-05T20:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.471084 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr5q8_8a6cead6-0872-4b49-a08c-529805f646f2/kube-multus/1.log"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.549310 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.549371 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.549388 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.549413 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.549432 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:49Z","lastTransitionTime":"2025-10-05T20:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.652847 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.652913 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.652933 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.652958 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.652975 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:49Z","lastTransitionTime":"2025-10-05T20:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.757084 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.757233 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.757259 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.757284 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.757303 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:49Z","lastTransitionTime":"2025-10-05T20:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.851394 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.851431 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.851394 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 05 20:16:49 crc kubenswrapper[4753]: E1005 20:16:49.851617 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 05 20:16:49 crc kubenswrapper[4753]: E1005 20:16:49.851727 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.851746 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr"
Oct 05 20:16:49 crc kubenswrapper[4753]: E1005 20:16:49.851862 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 05 20:16:49 crc kubenswrapper[4753]: E1005 20:16:49.852023 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.860292 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.860356 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.860375 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.860397 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.860453 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:49Z","lastTransitionTime":"2025-10-05T20:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.963034 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.963073 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.963083 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.963097 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:49 crc kubenswrapper[4753]: I1005 20:16:49.963114 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:49Z","lastTransitionTime":"2025-10-05T20:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.066064 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.066106 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.066120 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.066155 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.066169 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:50Z","lastTransitionTime":"2025-10-05T20:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.118442 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.118481 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.118491 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.118508 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.118518 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-05T20:16:50Z","lastTransitionTime":"2025-10-05T20:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.158625 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h"]
Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.159017 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h"
Oct 05 20:16:50 crc kubenswrapper[4753]: W1005 20:16:50.160544 4753 reflector.go:561] object-"openshift-cluster-version"/"default-dockercfg-gxtc4": failed to list *v1.Secret: secrets "default-dockercfg-gxtc4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object
Oct 05 20:16:50 crc kubenswrapper[4753]: W1005 20:16:50.160605 4753 reflector.go:561] object-"openshift-cluster-version"/"cluster-version-operator-serving-cert": failed to list *v1.Secret: secrets "cluster-version-operator-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object
Oct 05 20:16:50 crc kubenswrapper[4753]: E1005 20:16:50.160636 4753 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"cluster-version-operator-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cluster-version-operator-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 05 20:16:50 crc kubenswrapper[4753]: E1005 20:16:50.160602 4753 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"default-dockercfg-gxtc4\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-gxtc4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 05 20:16:50 crc kubenswrapper[4753]: W1005 20:16:50.160663 4753 reflector.go:561] object-"openshift-cluster-version"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object
Oct 05 20:16:50 crc kubenswrapper[4753]: E1005 20:16:50.160686 4753 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 05 20:16:50 crc kubenswrapper[4753]: W1005 20:16:50.160792 4753 reflector.go:561] object-"openshift-cluster-version"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object
Oct 05 20:16:50 crc kubenswrapper[4753]: E1005 20:16:50.160808 4753 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.279005 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/799615a8-8722-4211-9c80-cc9b66998f2e-serving-cert\") pod 
\"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.279041 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/799615a8-8722-4211-9c80-cc9b66998f2e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.279059 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/799615a8-8722-4211-9c80-cc9b66998f2e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.279287 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/799615a8-8722-4211-9c80-cc9b66998f2e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.279340 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/799615a8-8722-4211-9c80-cc9b66998f2e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:50 crc 
kubenswrapper[4753]: I1005 20:16:50.380815 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/799615a8-8722-4211-9c80-cc9b66998f2e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.380850 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/799615a8-8722-4211-9c80-cc9b66998f2e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.380872 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/799615a8-8722-4211-9c80-cc9b66998f2e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.380926 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/799615a8-8722-4211-9c80-cc9b66998f2e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.380945 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/799615a8-8722-4211-9c80-cc9b66998f2e-kube-api-access\") pod 
\"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.380981 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/799615a8-8722-4211-9c80-cc9b66998f2e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:50 crc kubenswrapper[4753]: I1005 20:16:50.381023 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/799615a8-8722-4211-9c80-cc9b66998f2e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:51 crc kubenswrapper[4753]: I1005 20:16:51.001704 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 05 20:16:51 crc kubenswrapper[4753]: I1005 20:16:51.017561 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/799615a8-8722-4211-9c80-cc9b66998f2e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:51 crc kubenswrapper[4753]: I1005 20:16:51.120332 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 05 20:16:51 crc kubenswrapper[4753]: I1005 20:16:51.125981 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/799615a8-8722-4211-9c80-cc9b66998f2e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:51 crc kubenswrapper[4753]: E1005 20:16:51.382401 4753 configmap.go:193] Couldn't get configMap openshift-cluster-version/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 05 20:16:51 crc kubenswrapper[4753]: E1005 20:16:51.382544 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/799615a8-8722-4211-9c80-cc9b66998f2e-service-ca podName:799615a8-8722-4211-9c80-cc9b66998f2e nodeName:}" failed. No retries permitted until 2025-10-05 20:16:51.882513307 +0000 UTC m=+120.730841579 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca" (UniqueName: "kubernetes.io/configmap/799615a8-8722-4211-9c80-cc9b66998f2e-service-ca") pod "cluster-version-operator-5c965bbfc6-6rp5h" (UID: "799615a8-8722-4211-9c80-cc9b66998f2e") : failed to sync configmap cache: timed out waiting for the condition Oct 05 20:16:51 crc kubenswrapper[4753]: I1005 20:16:51.559575 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 05 20:16:51 crc kubenswrapper[4753]: I1005 20:16:51.741613 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 05 20:16:51 crc kubenswrapper[4753]: E1005 20:16:51.819929 4753 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 05 20:16:51 crc kubenswrapper[4753]: I1005 20:16:51.851895 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:51 crc kubenswrapper[4753]: I1005 20:16:51.852007 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:51 crc kubenswrapper[4753]: E1005 20:16:51.854626 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:51 crc kubenswrapper[4753]: I1005 20:16:51.854666 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:51 crc kubenswrapper[4753]: I1005 20:16:51.854710 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:51 crc kubenswrapper[4753]: E1005 20:16:51.854870 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:51 crc kubenswrapper[4753]: E1005 20:16:51.855020 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:51 crc kubenswrapper[4753]: E1005 20:16:51.855122 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:51 crc kubenswrapper[4753]: I1005 20:16:51.897918 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/799615a8-8722-4211-9c80-cc9b66998f2e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:51 crc kubenswrapper[4753]: I1005 20:16:51.899283 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/799615a8-8722-4211-9c80-cc9b66998f2e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6rp5h\" (UID: \"799615a8-8722-4211-9c80-cc9b66998f2e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:51 crc kubenswrapper[4753]: E1005 20:16:51.941023 4753 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 05 20:16:51 crc kubenswrapper[4753]: I1005 20:16:51.974086 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 05 20:16:51 crc kubenswrapper[4753]: I1005 20:16:51.982061 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" Oct 05 20:16:52 crc kubenswrapper[4753]: I1005 20:16:52.483705 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" event={"ID":"799615a8-8722-4211-9c80-cc9b66998f2e","Type":"ContainerStarted","Data":"d981ca1e75765e03dedf70c74fb6399daf8096f815a2588f5bd10fe3cf57d387"} Oct 05 20:16:52 crc kubenswrapper[4753]: I1005 20:16:52.483767 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" event={"ID":"799615a8-8722-4211-9c80-cc9b66998f2e","Type":"ContainerStarted","Data":"52ffcd61eb1dd6b7bbb72caeb73df991b925f1c3399e5e96132b9a4da61085c8"} Oct 05 20:16:53 crc kubenswrapper[4753]: I1005 20:16:53.851967 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:53 crc kubenswrapper[4753]: E1005 20:16:53.852129 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:53 crc kubenswrapper[4753]: I1005 20:16:53.851985 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:53 crc kubenswrapper[4753]: I1005 20:16:53.852264 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:53 crc kubenswrapper[4753]: E1005 20:16:53.852352 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:53 crc kubenswrapper[4753]: E1005 20:16:53.852455 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:53 crc kubenswrapper[4753]: I1005 20:16:53.852469 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:53 crc kubenswrapper[4753]: E1005 20:16:53.852548 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:55 crc kubenswrapper[4753]: I1005 20:16:55.851714 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:55 crc kubenswrapper[4753]: I1005 20:16:55.851752 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:55 crc kubenswrapper[4753]: I1005 20:16:55.851782 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:55 crc kubenswrapper[4753]: I1005 20:16:55.851847 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:55 crc kubenswrapper[4753]: E1005 20:16:55.851839 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:55 crc kubenswrapper[4753]: E1005 20:16:55.851919 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:55 crc kubenswrapper[4753]: E1005 20:16:55.852036 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:55 crc kubenswrapper[4753]: E1005 20:16:55.852079 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:56 crc kubenswrapper[4753]: I1005 20:16:56.851898 4753 scope.go:117] "RemoveContainer" containerID="84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16" Oct 05 20:16:56 crc kubenswrapper[4753]: E1005 20:16:56.942587 4753 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 05 20:16:57 crc kubenswrapper[4753]: I1005 20:16:57.504796 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/3.log" Oct 05 20:16:57 crc kubenswrapper[4753]: I1005 20:16:57.506716 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerStarted","Data":"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11"} Oct 05 20:16:57 crc kubenswrapper[4753]: I1005 20:16:57.507922 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:16:57 crc kubenswrapper[4753]: I1005 20:16:57.541461 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podStartSLOduration=104.541444403 podStartE2EDuration="1m44.541444403s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:16:57.541223606 +0000 UTC m=+126.389551848" watchObservedRunningTime="2025-10-05 20:16:57.541444403 +0000 UTC m=+126.389772645" Oct 05 20:16:57 crc kubenswrapper[4753]: I1005 20:16:57.541802 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6rp5h" podStartSLOduration=104.541796344 podStartE2EDuration="1m44.541796344s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:16:52.508176959 +0000 UTC m=+121.356505231" watchObservedRunningTime="2025-10-05 20:16:57.541796344 +0000 UTC m=+126.390124576" Oct 05 20:16:57 crc kubenswrapper[4753]: I1005 20:16:57.776564 4753 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ktspr"] Oct 05 20:16:57 crc kubenswrapper[4753]: I1005 20:16:57.776758 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:16:57 crc kubenswrapper[4753]: E1005 20:16:57.776916 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9" Oct 05 20:16:57 crc kubenswrapper[4753]: I1005 20:16:57.851684 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:57 crc kubenswrapper[4753]: E1005 20:16:57.851819 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 05 20:16:57 crc kubenswrapper[4753]: I1005 20:16:57.852018 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:57 crc kubenswrapper[4753]: E1005 20:16:57.852094 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 05 20:16:57 crc kubenswrapper[4753]: I1005 20:16:57.852327 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:57 crc kubenswrapper[4753]: E1005 20:16:57.852382 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 05 20:16:59 crc kubenswrapper[4753]: I1005 20:16:59.851896 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:16:59 crc kubenswrapper[4753]: I1005 20:16:59.851956 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:16:59 crc kubenswrapper[4753]: I1005 20:16:59.852557 4753 scope.go:117] "RemoveContainer" containerID="d7ac7d15272eb1688a6a443c556a4ef57930c430084243bdff7df0d017044402" Oct 05 20:16:59 crc kubenswrapper[4753]: I1005 20:16:59.851956 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:16:59 crc kubenswrapper[4753]: I1005 20:16:59.852076 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr"
Oct 05 20:16:59 crc kubenswrapper[4753]: E1005 20:16:59.852688 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 05 20:16:59 crc kubenswrapper[4753]: E1005 20:16:59.852577 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 05 20:16:59 crc kubenswrapper[4753]: E1005 20:16:59.852777 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 05 20:16:59 crc kubenswrapper[4753]: E1005 20:16:59.852869 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9"
Oct 05 20:17:00 crc kubenswrapper[4753]: I1005 20:17:00.518225 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr5q8_8a6cead6-0872-4b49-a08c-529805f646f2/kube-multus/1.log"
Oct 05 20:17:00 crc kubenswrapper[4753]: I1005 20:17:00.518289 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr5q8" event={"ID":"8a6cead6-0872-4b49-a08c-529805f646f2","Type":"ContainerStarted","Data":"9909c718bf28ac2a716f30feb9dbce15eba410ec607a82b1fa8faf6328b099a0"}
Oct 05 20:17:00 crc kubenswrapper[4753]: I1005 20:17:00.542348 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zr5q8" podStartSLOduration=107.542325118 podStartE2EDuration="1m47.542325118s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:00.541502782 +0000 UTC m=+129.389831094" watchObservedRunningTime="2025-10-05 20:17:00.542325118 +0000 UTC m=+129.390653380"
Oct 05 20:17:01 crc kubenswrapper[4753]: I1005 20:17:01.851967 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 05 20:17:01 crc kubenswrapper[4753]: I1005 20:17:01.851984 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 05 20:17:01 crc kubenswrapper[4753]: I1005 20:17:01.852103 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr"
Oct 05 20:17:01 crc kubenswrapper[4753]: I1005 20:17:01.852173 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 05 20:17:01 crc kubenswrapper[4753]: E1005 20:17:01.854895 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 05 20:17:01 crc kubenswrapper[4753]: E1005 20:17:01.855035 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 05 20:17:01 crc kubenswrapper[4753]: E1005 20:17:01.855191 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ktspr" podUID="f99b8ef3-70ed-42e4-9217-a300fcd562d9"
Oct 05 20:17:01 crc kubenswrapper[4753]: E1005 20:17:01.855283 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 05 20:17:03 crc kubenswrapper[4753]: I1005 20:17:03.852060 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr"
Oct 05 20:17:03 crc kubenswrapper[4753]: I1005 20:17:03.852070 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 05 20:17:03 crc kubenswrapper[4753]: I1005 20:17:03.852210 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 05 20:17:03 crc kubenswrapper[4753]: I1005 20:17:03.856259 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 05 20:17:03 crc kubenswrapper[4753]: I1005 20:17:03.856890 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Oct 05 20:17:03 crc kubenswrapper[4753]: I1005 20:17:03.857097 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Oct 05 20:17:03 crc kubenswrapper[4753]: I1005 20:17:03.857289 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Oct 05 20:17:03 crc kubenswrapper[4753]: I1005 20:17:03.857523 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Oct 05 20:17:03 crc kubenswrapper[4753]: I1005 20:17:03.859528 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Oct 05 20:17:03 crc kubenswrapper[4753]: I1005 20:17:03.859660 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.322379 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.356369 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9jfj"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.356885 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.357113 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-d8b6f"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.357863 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.358184 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-bnpxs"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.358511 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-bnpxs"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.358721 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.359013 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.360454 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9w7kh"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.360989 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.361172 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dxgld"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.361598 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.373239 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.376377 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.376437 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.388884 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.388951 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.389103 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.389120 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.389184 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.389281 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.389441 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.389574 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.389815 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.389925 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.396329 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.396487 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.396702 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.397000 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.397177 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.397286 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.397377 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.397597 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.397693 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.397734 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.397743 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.397693 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.397836 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.398481 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.399675 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.399764 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.399956 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.400271 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.402916 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-7klvp"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.406041 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.406122 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.414554 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.415926 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7klvp"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.416681 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.417012 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cmpqd"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.417499 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-66zm9"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.417898 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tpd8r"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.418315 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.418474 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-66zm9"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.418750 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.418325 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cmpqd"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.419077 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.420351 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.420640 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.420840 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.421201 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.421510 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.423604 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.424093 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.428057 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.428650 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.429072 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.429122 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.429226 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.429233 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.429326 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.429332 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.429339 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.430988 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wsrwg"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.431555 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wsrwg"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.432103 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.432641 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.433762 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rr2d7"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.434029 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.435939 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.436007 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.436478 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q9dxv"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.436868 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q9dxv"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.438486 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-zbdx5"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.439300 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.440181 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.446396 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9w7kh"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.446659 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-zbdx5"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.453954 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.457964 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.459064 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-d8b6f"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.461161 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.461219 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.461338 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.461430 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.461605 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.461862 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.462295 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.462693 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.462781 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.462964 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.463062 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.463100 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.463182 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.463245 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.463069 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.463072 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.475956 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.476225 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.480456 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.480807 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.480871 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.480962 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.481045 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.481108 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.481198 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.481521 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.482102 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.482150 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.482220 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.482549 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.482575 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.482819 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.482825 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.483050 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.483106 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.483338 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.483344 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.483582 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.483590 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.483654 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.483711 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.483893 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.484093 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.483898 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.485847 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.486874 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.487278 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.487703 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.492970 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rwrj7"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.493061 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.493679 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.495260 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.495197 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dxgld"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.495502 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm"]
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.493219 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f436b8-10e7-4be0-9fd5-3047c5fafa45-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gnr4l\" (UID: \"91f436b8-10e7-4be0-9fd5-3047c5fafa45\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496669 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f65814d-e7b3-425a-b7a5-86c7e062ad35-serving-cert\") pod \"authentication-operator-69f744f599-9w7kh\" (UID: \"6f65814d-e7b3-425a-b7a5-86c7e062ad35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496686 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f65814d-e7b3-425a-b7a5-86c7e062ad35-service-ca-bundle\") pod \"authentication-operator-69f744f599-9w7kh\" (UID: \"6f65814d-e7b3-425a-b7a5-86c7e062ad35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496708 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496752 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496767 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-oauth-config\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496791 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-config\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496809 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f436b8-10e7-4be0-9fd5-3047c5fafa45-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gnr4l\" (UID: \"91f436b8-10e7-4be0-9fd5-3047c5fafa45\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496827 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-image-import-ca\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496841 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bn8t\" (UniqueName: \"kubernetes.io/projected/6f65814d-e7b3-425a-b7a5-86c7e062ad35-kube-api-access-2bn8t\") pod \"authentication-operator-69f744f599-9w7kh\" (UID: \"6f65814d-e7b3-425a-b7a5-86c7e062ad35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496860 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxjt7\" (UniqueName: \"kubernetes.io/projected/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-kube-api-access-dxjt7\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496874 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-oauth-serving-cert\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp"
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496890 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\"
(UniqueName: \"kubernetes.io/secret/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-etcd-client\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496904 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f65814d-e7b3-425a-b7a5-86c7e062ad35-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9w7kh\" (UID: \"6f65814d-e7b3-425a-b7a5-86c7e062ad35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496919 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x9jfj\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496934 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5-serving-cert\") pod \"openshift-config-operator-7777fb866f-dxgld\" (UID: \"a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496948 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdxqg\" (UniqueName: \"kubernetes.io/projected/a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5-kube-api-access-cdxqg\") pod \"openshift-config-operator-7777fb866f-dxgld\" (UID: \"a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496963 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-audit-dir\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496979 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1b80304-1bf0-464a-b196-392d4c4a6e6b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8h7qr\" (UID: \"f1b80304-1bf0-464a-b196-392d4c4a6e6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.496997 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-node-pullsecrets\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497010 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497025 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-config\") pod \"controller-manager-879f6c89f-x9jfj\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497040 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk6j9\" (UniqueName: \"kubernetes.io/projected/06068c71-eeec-4220-8279-579f47741023-kube-api-access-bk6j9\") pod \"cluster-samples-operator-665b6dd947-66zm9\" (UID: \"06068c71-eeec-4220-8279-579f47741023\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-66zm9" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497056 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjn26\" (UniqueName: \"kubernetes.io/projected/f1b80304-1bf0-464a-b196-392d4c4a6e6b-kube-api-access-sjn26\") pod \"cluster-image-registry-operator-dc59b4c8b-8h7qr\" (UID: \"f1b80304-1bf0-464a-b196-392d4c4a6e6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497070 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-serving-cert\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497090 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxfn5\" (UniqueName: \"kubernetes.io/projected/5615672e-59bf-448b-88ba-75a02438a8ad-kube-api-access-cxfn5\") pod \"route-controller-manager-6576b87f9c-jcc7v\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497114 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkdfx\" (UniqueName: \"kubernetes.io/projected/0cae56c9-6ca6-49f3-97d2-14e8a6748315-kube-api-access-dkdfx\") pod \"downloads-7954f5f757-bnpxs\" (UID: \"0cae56c9-6ca6-49f3-97d2-14e8a6748315\") " pod="openshift-console/downloads-7954f5f757-bnpxs" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497129 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjg8c\" (UniqueName: \"kubernetes.io/projected/a2a48e1a-ef95-45ff-89b4-6779d95a2096-kube-api-access-cjg8c\") pod \"openshift-controller-manager-operator-756b6f6bc6-h278r\" (UID: \"a2a48e1a-ef95-45ff-89b4-6779d95a2096\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497490 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dxgld\" (UID: \"a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497513 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-service-ca\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497543 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/22db54ee-7d52-475e-a824-9e563b2920e8-images\") pod \"machine-api-operator-5694c8668f-d8b6f\" (UID: \"22db54ee-7d52-475e-a824-9e563b2920e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497559 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497580 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22db54ee-7d52-475e-a824-9e563b2920e8-config\") pod \"machine-api-operator-5694c8668f-d8b6f\" (UID: \"22db54ee-7d52-475e-a824-9e563b2920e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497598 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497616 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497632 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497649 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5615672e-59bf-448b-88ba-75a02438a8ad-client-ca\") pod \"route-controller-manager-6576b87f9c-jcc7v\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497665 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-audit-policies\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497679 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-audit-dir\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497696 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/22db54ee-7d52-475e-a824-9e563b2920e8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-d8b6f\" (UID: \"22db54ee-7d52-475e-a824-9e563b2920e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497713 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2a48e1a-ef95-45ff-89b4-6779d95a2096-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h278r\" (UID: \"a2a48e1a-ef95-45ff-89b4-6779d95a2096\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497728 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-trusted-ca-bundle\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497747 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1b80304-1bf0-464a-b196-392d4c4a6e6b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8h7qr\" (UID: \"f1b80304-1bf0-464a-b196-392d4c4a6e6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497763 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497782 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-audit\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497796 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497811 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/06068c71-eeec-4220-8279-579f47741023-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-66zm9\" (UID: \"06068c71-eeec-4220-8279-579f47741023\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-66zm9" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497830 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-etcd-serving-ca\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 
20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497845 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-encryption-config\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497861 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5615672e-59bf-448b-88ba-75a02438a8ad-config\") pod \"route-controller-manager-6576b87f9c-jcc7v\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497876 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1b80304-1bf0-464a-b196-392d4c4a6e6b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8h7qr\" (UID: \"f1b80304-1bf0-464a-b196-392d4c4a6e6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497891 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-config\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497908 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxxtp\" (UniqueName: 
\"kubernetes.io/projected/91f436b8-10e7-4be0-9fd5-3047c5fafa45-kube-api-access-xxxtp\") pod \"openshift-apiserver-operator-796bbdcf4f-gnr4l\" (UID: \"91f436b8-10e7-4be0-9fd5-3047c5fafa45\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497924 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a48e1a-ef95-45ff-89b4-6779d95a2096-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h278r\" (UID: \"a2a48e1a-ef95-45ff-89b4-6779d95a2096\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497938 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f65814d-e7b3-425a-b7a5-86c7e062ad35-config\") pod \"authentication-operator-69f744f599-9w7kh\" (UID: \"6f65814d-e7b3-425a-b7a5-86c7e062ad35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497958 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e73e9c7-3af4-4b10-a331-7899608702b3-serving-cert\") pod \"controller-manager-879f6c89f-x9jfj\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.497973 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57gvx\" (UniqueName: \"kubernetes.io/projected/22db54ee-7d52-475e-a824-9e563b2920e8-kube-api-access-57gvx\") pod \"machine-api-operator-5694c8668f-d8b6f\" (UID: 
\"22db54ee-7d52-475e-a824-9e563b2920e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.498003 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvgnv\" (UniqueName: \"kubernetes.io/projected/eb329af5-99e8-42d9-b79e-4c9acd09204d-kube-api-access-rvgnv\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.498019 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.498037 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.498058 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-serving-cert\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.498072 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65qg2\" (UniqueName: \"kubernetes.io/projected/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-kube-api-access-65qg2\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.498088 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nspjp\" (UniqueName: \"kubernetes.io/projected/6e73e9c7-3af4-4b10-a331-7899608702b3-kube-api-access-nspjp\") pod \"controller-manager-879f6c89f-x9jfj\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.498102 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-client-ca\") pod \"controller-manager-879f6c89f-x9jfj\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.498117 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5615672e-59bf-448b-88ba-75a02438a8ad-serving-cert\") pod \"route-controller-manager-6576b87f9c-jcc7v\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.498910 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.500053 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.504058 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.505662 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.506782 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.507536 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.508965 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.507963 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.508047 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.512266 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.512528 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.513748 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h7rzp"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.513789 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.514079 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xjp5g"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.514304 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.514511 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjp5g" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.514611 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h7rzp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.523186 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.523441 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.528638 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.536598 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.541978 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9jfj"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.543279 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-nfxm6"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.545196 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.566408 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.569455 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.570737 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-nfxm6" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.572268 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.597357 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.598054 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.598662 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.599680 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.600224 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.601084 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.602238 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.603708 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.603949 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604123 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604541 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/22db54ee-7d52-475e-a824-9e563b2920e8-images\") pod \"machine-api-operator-5694c8668f-d8b6f\" (UID: \"22db54ee-7d52-475e-a824-9e563b2920e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604586 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dxgld\" (UID: \"a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604614 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-service-ca\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604645 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22db54ee-7d52-475e-a824-9e563b2920e8-config\") pod \"machine-api-operator-5694c8668f-d8b6f\" (UID: \"22db54ee-7d52-475e-a824-9e563b2920e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f" Oct 05 20:17:10 crc kubenswrapper[4753]: 
I1005 20:17:10.604670 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604693 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604719 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5615672e-59bf-448b-88ba-75a02438a8ad-client-ca\") pod \"route-controller-manager-6576b87f9c-jcc7v\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604741 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-audit-policies\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604767 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604790 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604815 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9a13243e-13d5-4a22-9417-8dfc3896f332-images\") pod \"machine-config-operator-74547568cd-h6ntx\" (UID: \"9a13243e-13d5-4a22-9417-8dfc3896f332\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604839 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b5d8bc9-c226-4476-a309-fb21a5f79af3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-59prk\" (UID: \"5b5d8bc9-c226-4476-a309-fb21a5f79af3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604865 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/22db54ee-7d52-475e-a824-9e563b2920e8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-d8b6f\" (UID: \"22db54ee-7d52-475e-a824-9e563b2920e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f" Oct 05 20:17:10 crc 
kubenswrapper[4753]: I1005 20:17:10.604888 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2a48e1a-ef95-45ff-89b4-6779d95a2096-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h278r\" (UID: \"a2a48e1a-ef95-45ff-89b4-6779d95a2096\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604908 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-trusted-ca-bundle\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604928 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-audit-dir\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604956 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1b80304-1bf0-464a-b196-392d4c4a6e6b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8h7qr\" (UID: \"f1b80304-1bf0-464a-b196-392d4c4a6e6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.604979 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: 
\"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605002 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef7125ff-9b89-4972-954b-61145623ecec-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cwjwm\" (UID: \"ef7125ff-9b89-4972-954b-61145623ecec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605029 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-audit\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605051 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605085 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/06068c71-eeec-4220-8279-579f47741023-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-66zm9\" (UID: \"06068c71-eeec-4220-8279-579f47741023\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-66zm9" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605109 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-etcd-serving-ca\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605133 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-encryption-config\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605173 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9a13243e-13d5-4a22-9417-8dfc3896f332-auth-proxy-config\") pod \"machine-config-operator-74547568cd-h6ntx\" (UID: \"9a13243e-13d5-4a22-9417-8dfc3896f332\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605190 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwrc6\" (UniqueName: \"kubernetes.io/projected/5b5d8bc9-c226-4476-a309-fb21a5f79af3-kube-api-access-xwrc6\") pod \"kube-storage-version-migrator-operator-b67b599dd-59prk\" (UID: \"5b5d8bc9-c226-4476-a309-fb21a5f79af3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605212 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5615672e-59bf-448b-88ba-75a02438a8ad-config\") pod \"route-controller-manager-6576b87f9c-jcc7v\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605227 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1b80304-1bf0-464a-b196-392d4c4a6e6b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8h7qr\" (UID: \"f1b80304-1bf0-464a-b196-392d4c4a6e6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605243 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef7125ff-9b89-4972-954b-61145623ecec-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cwjwm\" (UID: \"ef7125ff-9b89-4972-954b-61145623ecec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605263 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxxtp\" (UniqueName: \"kubernetes.io/projected/91f436b8-10e7-4be0-9fd5-3047c5fafa45-kube-api-access-xxxtp\") pod \"openshift-apiserver-operator-796bbdcf4f-gnr4l\" (UID: \"91f436b8-10e7-4be0-9fd5-3047c5fafa45\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605281 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a48e1a-ef95-45ff-89b4-6779d95a2096-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h278r\" (UID: \"a2a48e1a-ef95-45ff-89b4-6779d95a2096\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605309 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f65814d-e7b3-425a-b7a5-86c7e062ad35-config\") pod \"authentication-operator-69f744f599-9w7kh\" (UID: \"6f65814d-e7b3-425a-b7a5-86c7e062ad35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605325 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-config\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605342 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a0f0082-3cf0-4339-9964-02916c137f45-config\") pod \"machine-approver-56656f9798-9h4dc\" (UID: \"3a0f0082-3cf0-4339-9964-02916c137f45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605365 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a13243e-13d5-4a22-9417-8dfc3896f332-proxy-tls\") pod \"machine-config-operator-74547568cd-h6ntx\" (UID: \"9a13243e-13d5-4a22-9417-8dfc3896f332\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605391 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e73e9c7-3af4-4b10-a331-7899608702b3-serving-cert\") pod \"controller-manager-879f6c89f-x9jfj\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" 
Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605416 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd241b3b-a68f-487b-bcd8-ca61782d9e4f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zpdnw\" (UID: \"cd241b3b-a68f-487b-bcd8-ca61782d9e4f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605445 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57gvx\" (UniqueName: \"kubernetes.io/projected/22db54ee-7d52-475e-a824-9e563b2920e8-kube-api-access-57gvx\") pod \"machine-api-operator-5694c8668f-d8b6f\" (UID: \"22db54ee-7d52-475e-a824-9e563b2920e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605470 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp4z7\" (UniqueName: \"kubernetes.io/projected/3a0f0082-3cf0-4339-9964-02916c137f45-kube-api-access-xp4z7\") pod \"machine-approver-56656f9798-9h4dc\" (UID: \"3a0f0082-3cf0-4339-9964-02916c137f45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605497 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-serving-cert\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605522 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvgnv\" (UniqueName: 
\"kubernetes.io/projected/eb329af5-99e8-42d9-b79e-4c9acd09204d-kube-api-access-rvgnv\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605547 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605568 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605588 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5-metrics-tls\") pod \"ingress-operator-5b745b69d9-lkdpm\" (UID: \"b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605612 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5-trusted-ca\") pod \"ingress-operator-5b745b69d9-lkdpm\" (UID: \"b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm" Oct 05 20:17:10 crc 
kubenswrapper[4753]: I1005 20:17:10.605634 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65qg2\" (UniqueName: \"kubernetes.io/projected/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-kube-api-access-65qg2\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605651 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nspjp\" (UniqueName: \"kubernetes.io/projected/6e73e9c7-3af4-4b10-a331-7899608702b3-kube-api-access-nspjp\") pod \"controller-manager-879f6c89f-x9jfj\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605667 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b5d8bc9-c226-4476-a309-fb21a5f79af3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-59prk\" (UID: \"5b5d8bc9-c226-4476-a309-fb21a5f79af3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605690 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3a0f0082-3cf0-4339-9964-02916c137f45-auth-proxy-config\") pod \"machine-approver-56656f9798-9h4dc\" (UID: \"3a0f0082-3cf0-4339-9964-02916c137f45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605717 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-client-ca\") pod \"controller-manager-879f6c89f-x9jfj\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605734 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5615672e-59bf-448b-88ba-75a02438a8ad-serving-cert\") pod \"route-controller-manager-6576b87f9c-jcc7v\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605753 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6lr\" (UniqueName: \"kubernetes.io/projected/99413e5b-15a4-40f6-b7a5-2dbee5eafb1d-kube-api-access-qt6lr\") pod \"migrator-59844c95c7-xjp5g\" (UID: \"99413e5b-15a4-40f6-b7a5-2dbee5eafb1d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjp5g" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605786 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f436b8-10e7-4be0-9fd5-3047c5fafa45-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gnr4l\" (UID: \"91f436b8-10e7-4be0-9fd5-3047c5fafa45\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605812 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f65814d-e7b3-425a-b7a5-86c7e062ad35-serving-cert\") pod \"authentication-operator-69f744f599-9w7kh\" (UID: \"6f65814d-e7b3-425a-b7a5-86c7e062ad35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh" Oct 05 20:17:10 
crc kubenswrapper[4753]: I1005 20:17:10.605834 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f65814d-e7b3-425a-b7a5-86c7e062ad35-service-ca-bundle\") pod \"authentication-operator-69f744f599-9w7kh\" (UID: \"6f65814d-e7b3-425a-b7a5-86c7e062ad35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605852 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605880 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sdjg\" (UniqueName: \"kubernetes.io/projected/f647e6b6-7d7f-4c72-9506-af98598583fc-kube-api-access-8sdjg\") pod \"dns-operator-744455d44c-q9dxv\" (UID: \"f647e6b6-7d7f-4c72-9506-af98598583fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-q9dxv" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605928 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605950 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lkdpm\" (UID: \"b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605972 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ggbx\" (UniqueName: \"kubernetes.io/projected/b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5-kube-api-access-7ggbx\") pod \"ingress-operator-5b745b69d9-lkdpm\" (UID: \"b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.605997 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606008 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606023 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-config\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 
20:17:10.606047 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-oauth-config\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606075 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f436b8-10e7-4be0-9fd5-3047c5fafa45-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gnr4l\" (UID: \"91f436b8-10e7-4be0-9fd5-3047c5fafa45\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606100 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd241b3b-a68f-487b-bcd8-ca61782d9e4f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zpdnw\" (UID: \"cd241b3b-a68f-487b-bcd8-ca61782d9e4f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606122 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3a0f0082-3cf0-4339-9964-02916c137f45-machine-approver-tls\") pod \"machine-approver-56656f9798-9h4dc\" (UID: \"3a0f0082-3cf0-4339-9964-02916c137f45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606164 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-image-import-ca\") pod 
\"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606188 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bn8t\" (UniqueName: \"kubernetes.io/projected/6f65814d-e7b3-425a-b7a5-86c7e062ad35-kube-api-access-2bn8t\") pod \"authentication-operator-69f744f599-9w7kh\" (UID: \"6f65814d-e7b3-425a-b7a5-86c7e062ad35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606219 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-etcd-client\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606242 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxjt7\" (UniqueName: \"kubernetes.io/projected/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-kube-api-access-dxjt7\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606264 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-oauth-serving-cert\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606288 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x9jfj\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606310 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5-serving-cert\") pod \"openshift-config-operator-7777fb866f-dxgld\" (UID: \"a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606346 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdxqg\" (UniqueName: \"kubernetes.io/projected/a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5-kube-api-access-cdxqg\") pod \"openshift-config-operator-7777fb866f-dxgld\" (UID: \"a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606363 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f65814d-e7b3-425a-b7a5-86c7e062ad35-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9w7kh\" (UID: \"6f65814d-e7b3-425a-b7a5-86c7e062ad35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606383 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1b80304-1bf0-464a-b196-392d4c4a6e6b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8h7qr\" (UID: \"f1b80304-1bf0-464a-b196-392d4c4a6e6b\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606400 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-node-pullsecrets\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606416 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606433 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-audit-dir\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606452 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qghh\" (UniqueName: \"kubernetes.io/projected/9a13243e-13d5-4a22-9417-8dfc3896f332-kube-api-access-9qghh\") pod \"machine-config-operator-74547568cd-h6ntx\" (UID: \"9a13243e-13d5-4a22-9417-8dfc3896f332\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606476 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/ef7125ff-9b89-4972-954b-61145623ecec-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cwjwm\" (UID: \"ef7125ff-9b89-4972-954b-61145623ecec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606498 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-config\") pod \"controller-manager-879f6c89f-x9jfj\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606517 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjn26\" (UniqueName: \"kubernetes.io/projected/f1b80304-1bf0-464a-b196-392d4c4a6e6b-kube-api-access-sjn26\") pod \"cluster-image-registry-operator-dc59b4c8b-8h7qr\" (UID: \"f1b80304-1bf0-464a-b196-392d4c4a6e6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606534 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-serving-cert\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606552 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk6j9\" (UniqueName: \"kubernetes.io/projected/06068c71-eeec-4220-8279-579f47741023-kube-api-access-bk6j9\") pod \"cluster-samples-operator-665b6dd947-66zm9\" (UID: \"06068c71-eeec-4220-8279-579f47741023\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-66zm9" Oct 05 
20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606568 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd241b3b-a68f-487b-bcd8-ca61782d9e4f-config\") pod \"kube-controller-manager-operator-78b949d7b-zpdnw\" (UID: \"cd241b3b-a68f-487b-bcd8-ca61782d9e4f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606587 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f647e6b6-7d7f-4c72-9506-af98598583fc-metrics-tls\") pod \"dns-operator-744455d44c-q9dxv\" (UID: \"f647e6b6-7d7f-4c72-9506-af98598583fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-q9dxv" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606614 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxfn5\" (UniqueName: \"kubernetes.io/projected/5615672e-59bf-448b-88ba-75a02438a8ad-kube-api-access-cxfn5\") pod \"route-controller-manager-6576b87f9c-jcc7v\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606625 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/22db54ee-7d52-475e-a824-9e563b2920e8-images\") pod \"machine-api-operator-5694c8668f-d8b6f\" (UID: \"22db54ee-7d52-475e-a824-9e563b2920e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606632 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkdfx\" (UniqueName: \"kubernetes.io/projected/0cae56c9-6ca6-49f3-97d2-14e8a6748315-kube-api-access-dkdfx\") pod 
\"downloads-7954f5f757-bnpxs\" (UID: \"0cae56c9-6ca6-49f3-97d2-14e8a6748315\") " pod="openshift-console/downloads-7954f5f757-bnpxs" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606665 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjg8c\" (UniqueName: \"kubernetes.io/projected/a2a48e1a-ef95-45ff-89b4-6779d95a2096-kube-api-access-cjg8c\") pod \"openshift-controller-manager-operator-756b6f6bc6-h278r\" (UID: \"a2a48e1a-ef95-45ff-89b4-6779d95a2096\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.606882 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dxgld\" (UID: \"a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.607418 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-audit\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.607886 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-etcd-serving-ca\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.609663 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22db54ee-7d52-475e-a824-9e563b2920e8-config\") pod \"machine-api-operator-5694c8668f-d8b6f\" (UID: \"22db54ee-7d52-475e-a824-9e563b2920e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.614619 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.615466 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.615811 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gx875"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.616456 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gx875" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.617513 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4pbr6"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.618239 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4pbr6" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.618621 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f65814d-e7b3-425a-b7a5-86c7e062ad35-service-ca-bundle\") pod \"authentication-operator-69f744f599-9w7kh\" (UID: \"6f65814d-e7b3-425a-b7a5-86c7e062ad35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.619031 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5615672e-59bf-448b-88ba-75a02438a8ad-client-ca\") pod \"route-controller-manager-6576b87f9c-jcc7v\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.619124 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f436b8-10e7-4be0-9fd5-3047c5fafa45-config\") pod \"openshift-apiserver-operator-796bbdcf4f-gnr4l\" (UID: \"91f436b8-10e7-4be0-9fd5-3047c5fafa45\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.619454 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-audit-policies\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.619500 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6f65814d-e7b3-425a-b7a5-86c7e062ad35-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9w7kh\" (UID: \"6f65814d-e7b3-425a-b7a5-86c7e062ad35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.619526 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-node-pullsecrets\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.619693 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-audit-dir\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.619815 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-audit-dir\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.620113 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.620542 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-config\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.620935 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1b80304-1bf0-464a-b196-392d4c4a6e6b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8h7qr\" (UID: \"f1b80304-1bf0-464a-b196-392d4c4a6e6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.621348 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.621509 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-config\") pod \"controller-manager-879f6c89f-x9jfj\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.622104 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.623537 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-service-ca\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.623802 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-oauth-serving-cert\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.624152 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-trusted-ca-bundle\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.626507 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a48e1a-ef95-45ff-89b4-6779d95a2096-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h278r\" (UID: \"a2a48e1a-ef95-45ff-89b4-6779d95a2096\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.626730 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-client-ca\") pod \"controller-manager-879f6c89f-x9jfj\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.626945 4753 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.627367 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.627631 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.627867 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.630267 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f65814d-e7b3-425a-b7a5-86c7e062ad35-config\") pod \"authentication-operator-69f744f599-9w7kh\" (UID: \"6f65814d-e7b3-425a-b7a5-86c7e062ad35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.632928 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-config\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.633437 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5615672e-59bf-448b-88ba-75a02438a8ad-config\") pod \"route-controller-manager-6576b87f9c-jcc7v\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.633538 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.633585 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-image-import-ca\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.634653 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-serving-cert\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.635058 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.635426 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/06068c71-eeec-4220-8279-579f47741023-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-66zm9\" (UID: \"06068c71-eeec-4220-8279-579f47741023\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-66zm9" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.635563 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f436b8-10e7-4be0-9fd5-3047c5fafa45-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-gnr4l\" (UID: \"91f436b8-10e7-4be0-9fd5-3047c5fafa45\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.636710 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x9jfj\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.637022 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.637385 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-etcd-client\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.639532 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.640175 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e73e9c7-3af4-4b10-a331-7899608702b3-serving-cert\") pod \"controller-manager-879f6c89f-x9jfj\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.640220 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9wgjd"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.640864 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9wgjd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.641876 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/22db54ee-7d52-475e-a824-9e563b2920e8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-d8b6f\" (UID: \"22db54ee-7d52-475e-a824-9e563b2920e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.642227 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1b80304-1bf0-464a-b196-392d4c4a6e6b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8h7qr\" (UID: \"f1b80304-1bf0-464a-b196-392d4c4a6e6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.644464 4753 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-console/downloads-7954f5f757-bnpxs"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.651820 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-encryption-config\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.652518 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2a48e1a-ef95-45ff-89b4-6779d95a2096-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h278r\" (UID: \"a2a48e1a-ef95-45ff-89b4-6779d95a2096\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.652628 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f65814d-e7b3-425a-b7a5-86c7e062ad35-serving-cert\") pod \"authentication-operator-69f744f599-9w7kh\" (UID: \"6f65814d-e7b3-425a-b7a5-86c7e062ad35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.652935 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.653324 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5-serving-cert\") pod \"openshift-config-operator-7777fb866f-dxgld\" (UID: \"a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" Oct 05 20:17:10 crc kubenswrapper[4753]: 
I1005 20:17:10.653563 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.654275 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.654502 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.654655 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-66zm9"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.654971 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-oauth-config\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.655263 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-f9d7485db-7klvp"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.658461 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.658578 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.658696 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w2sqj"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.657828 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.657357 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5615672e-59bf-448b-88ba-75a02438a8ad-serving-cert\") pod \"route-controller-manager-6576b87f9c-jcc7v\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.660895 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.663438 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.663546 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cmpqd"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.663661 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wsrwg"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 
20:17:10.663392 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.661038 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.664169 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.665521 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rr2d7"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.664565 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-serving-cert\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.666317 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-d78vr"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.667063 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-d78vr" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.667559 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-nfxm6"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.668392 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.669434 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.670575 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tpd8r"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.671965 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.673219 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q9dxv"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.675193 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.676677 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w2sqj"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.678363 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.678507 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-9wgjd"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.680330 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.681685 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.684925 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4pbr6"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.688278 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.691362 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.693978 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rwrj7"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.696747 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.698116 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.698406 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.699366 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h7rzp"] Oct 05 20:17:10 crc 
kubenswrapper[4753]: I1005 20:17:10.700590 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.701736 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.702862 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-459zr"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.703802 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-459zr" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.703902 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xjp5g"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.704930 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gx875"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.706004 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-459zr"] Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707212 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5-metrics-tls\") pod \"ingress-operator-5b745b69d9-lkdpm\" (UID: \"b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707244 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5-trusted-ca\") pod \"ingress-operator-5b745b69d9-lkdpm\" (UID: \"b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707270 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b5d8bc9-c226-4476-a309-fb21a5f79af3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-59prk\" (UID: \"5b5d8bc9-c226-4476-a309-fb21a5f79af3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707288 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3a0f0082-3cf0-4339-9964-02916c137f45-auth-proxy-config\") pod \"machine-approver-56656f9798-9h4dc\" (UID: \"3a0f0082-3cf0-4339-9964-02916c137f45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707306 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt6lr\" (UniqueName: \"kubernetes.io/projected/99413e5b-15a4-40f6-b7a5-2dbee5eafb1d-kube-api-access-qt6lr\") pod \"migrator-59844c95c7-xjp5g\" (UID: \"99413e5b-15a4-40f6-b7a5-2dbee5eafb1d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjp5g" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707321 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sdjg\" (UniqueName: \"kubernetes.io/projected/f647e6b6-7d7f-4c72-9506-af98598583fc-kube-api-access-8sdjg\") pod \"dns-operator-744455d44c-q9dxv\" (UID: \"f647e6b6-7d7f-4c72-9506-af98598583fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-q9dxv" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707345 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lkdpm\" (UID: \"b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707379 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ggbx\" (UniqueName: \"kubernetes.io/projected/b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5-kube-api-access-7ggbx\") pod \"ingress-operator-5b745b69d9-lkdpm\" (UID: \"b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707412 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd241b3b-a68f-487b-bcd8-ca61782d9e4f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zpdnw\" (UID: \"cd241b3b-a68f-487b-bcd8-ca61782d9e4f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707431 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3a0f0082-3cf0-4339-9964-02916c137f45-machine-approver-tls\") pod \"machine-approver-56656f9798-9h4dc\" (UID: \"3a0f0082-3cf0-4339-9964-02916c137f45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707470 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qghh\" (UniqueName: \"kubernetes.io/projected/9a13243e-13d5-4a22-9417-8dfc3896f332-kube-api-access-9qghh\") pod \"machine-config-operator-74547568cd-h6ntx\" (UID: \"9a13243e-13d5-4a22-9417-8dfc3896f332\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707491 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef7125ff-9b89-4972-954b-61145623ecec-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cwjwm\" (UID: \"ef7125ff-9b89-4972-954b-61145623ecec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707515 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd241b3b-a68f-487b-bcd8-ca61782d9e4f-config\") pod \"kube-controller-manager-operator-78b949d7b-zpdnw\" (UID: \"cd241b3b-a68f-487b-bcd8-ca61782d9e4f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707532 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f647e6b6-7d7f-4c72-9506-af98598583fc-metrics-tls\") pod \"dns-operator-744455d44c-q9dxv\" (UID: \"f647e6b6-7d7f-4c72-9506-af98598583fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-q9dxv" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707587 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9a13243e-13d5-4a22-9417-8dfc3896f332-images\") pod \"machine-config-operator-74547568cd-h6ntx\" (UID: \"9a13243e-13d5-4a22-9417-8dfc3896f332\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707604 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5b5d8bc9-c226-4476-a309-fb21a5f79af3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-59prk\" (UID: \"5b5d8bc9-c226-4476-a309-fb21a5f79af3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707623 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef7125ff-9b89-4972-954b-61145623ecec-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cwjwm\" (UID: \"ef7125ff-9b89-4972-954b-61145623ecec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707648 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9a13243e-13d5-4a22-9417-8dfc3896f332-auth-proxy-config\") pod \"machine-config-operator-74547568cd-h6ntx\" (UID: \"9a13243e-13d5-4a22-9417-8dfc3896f332\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707664 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwrc6\" (UniqueName: \"kubernetes.io/projected/5b5d8bc9-c226-4476-a309-fb21a5f79af3-kube-api-access-xwrc6\") pod \"kube-storage-version-migrator-operator-b67b599dd-59prk\" (UID: \"5b5d8bc9-c226-4476-a309-fb21a5f79af3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707680 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef7125ff-9b89-4972-954b-61145623ecec-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cwjwm\" (UID: 
\"ef7125ff-9b89-4972-954b-61145623ecec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707900 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3a0f0082-3cf0-4339-9964-02916c137f45-auth-proxy-config\") pod \"machine-approver-56656f9798-9h4dc\" (UID: \"3a0f0082-3cf0-4339-9964-02916c137f45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.707992 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a0f0082-3cf0-4339-9964-02916c137f45-config\") pod \"machine-approver-56656f9798-9h4dc\" (UID: \"3a0f0082-3cf0-4339-9964-02916c137f45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.708026 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a13243e-13d5-4a22-9417-8dfc3896f332-proxy-tls\") pod \"machine-config-operator-74547568cd-h6ntx\" (UID: \"9a13243e-13d5-4a22-9417-8dfc3896f332\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.708046 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd241b3b-a68f-487b-bcd8-ca61782d9e4f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zpdnw\" (UID: \"cd241b3b-a68f-487b-bcd8-ca61782d9e4f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.708073 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xp4z7\" (UniqueName: \"kubernetes.io/projected/3a0f0082-3cf0-4339-9964-02916c137f45-kube-api-access-xp4z7\") pod \"machine-approver-56656f9798-9h4dc\" (UID: \"3a0f0082-3cf0-4339-9964-02916c137f45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.708689 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9a13243e-13d5-4a22-9417-8dfc3896f332-auth-proxy-config\") pod \"machine-config-operator-74547568cd-h6ntx\" (UID: \"9a13243e-13d5-4a22-9417-8dfc3896f332\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.708718 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a0f0082-3cf0-4339-9964-02916c137f45-config\") pod \"machine-approver-56656f9798-9h4dc\" (UID: \"3a0f0082-3cf0-4339-9964-02916c137f45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.710245 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3a0f0082-3cf0-4339-9964-02916c137f45-machine-approver-tls\") pod \"machine-approver-56656f9798-9h4dc\" (UID: \"3a0f0082-3cf0-4339-9964-02916c137f45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.712157 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f647e6b6-7d7f-4c72-9506-af98598583fc-metrics-tls\") pod \"dns-operator-744455d44c-q9dxv\" (UID: \"f647e6b6-7d7f-4c72-9506-af98598583fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-q9dxv" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 
20:17:10.719034 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.739055 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.758620 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.778693 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.798690 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.819294 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.839709 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.878236 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.899247 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.919376 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.938936 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 05 20:17:10 crc 
kubenswrapper[4753]: I1005 20:17:10.959652 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.978778 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 05 20:17:10 crc kubenswrapper[4753]: I1005 20:17:10.998388 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.019369 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.038872 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.063243 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.069182 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5-trusted-ca\") pod \"ingress-operator-5b745b69d9-lkdpm\" (UID: \"b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.078858 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.092110 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5-metrics-tls\") pod \"ingress-operator-5b745b69d9-lkdpm\" (UID: \"b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.098809 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.118208 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.139029 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.159005 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.178572 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.198985 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.214623 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd241b3b-a68f-487b-bcd8-ca61782d9e4f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zpdnw\" (UID: \"cd241b3b-a68f-487b-bcd8-ca61782d9e4f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.218934 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 
20:17:11.229088 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd241b3b-a68f-487b-bcd8-ca61782d9e4f-config\") pod \"kube-controller-manager-operator-78b949d7b-zpdnw\" (UID: \"cd241b3b-a68f-487b-bcd8-ca61782d9e4f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.239164 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.256083 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef7125ff-9b89-4972-954b-61145623ecec-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cwjwm\" (UID: \"ef7125ff-9b89-4972-954b-61145623ecec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.259102 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.279180 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.279870 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef7125ff-9b89-4972-954b-61145623ecec-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cwjwm\" (UID: \"ef7125ff-9b89-4972-954b-61145623ecec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.299072 4753 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.308658 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9a13243e-13d5-4a22-9417-8dfc3896f332-images\") pod \"machine-config-operator-74547568cd-h6ntx\" (UID: \"9a13243e-13d5-4a22-9417-8dfc3896f332\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.319497 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.339783 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.351918 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a13243e-13d5-4a22-9417-8dfc3896f332-proxy-tls\") pod \"machine-config-operator-74547568cd-h6ntx\" (UID: \"9a13243e-13d5-4a22-9417-8dfc3896f332\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.359047 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.379007 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.398051 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 
20:17:11.424827 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.439417 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.458829 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.478927 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.491819 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b5d8bc9-c226-4476-a309-fb21a5f79af3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-59prk\" (UID: \"5b5d8bc9-c226-4476-a309-fb21a5f79af3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.499524 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.519305 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.529573 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b5d8bc9-c226-4476-a309-fb21a5f79af3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-59prk\" (UID: \"5b5d8bc9-c226-4476-a309-fb21a5f79af3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.537848 4753 request.go:700] Waited for 1.000763476s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.539765 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.579354 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.599595 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.619301 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.639990 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.659561 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.680611 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.700336 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.719328 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.739763 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.780256 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.789130 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjg8c\" (UniqueName: \"kubernetes.io/projected/a2a48e1a-ef95-45ff-89b4-6779d95a2096-kube-api-access-cjg8c\") pod \"openshift-controller-manager-operator-756b6f6bc6-h278r\" (UID: \"a2a48e1a-ef95-45ff-89b4-6779d95a2096\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.799929 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.846591 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjn26\" (UniqueName: \"kubernetes.io/projected/f1b80304-1bf0-464a-b196-392d4c4a6e6b-kube-api-access-sjn26\") pod \"cluster-image-registry-operator-dc59b4c8b-8h7qr\" (UID: \"f1b80304-1bf0-464a-b196-392d4c4a6e6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.870105 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57gvx\" (UniqueName: \"kubernetes.io/projected/22db54ee-7d52-475e-a824-9e563b2920e8-kube-api-access-57gvx\") pod \"machine-api-operator-5694c8668f-d8b6f\" (UID: \"22db54ee-7d52-475e-a824-9e563b2920e8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.893678 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvgnv\" (UniqueName: \"kubernetes.io/projected/eb329af5-99e8-42d9-b79e-4c9acd09204d-kube-api-access-rvgnv\") pod \"console-f9d7485db-7klvp\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " pod="openshift-console/console-f9d7485db-7klvp"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.903704 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.908296 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxjt7\" (UniqueName: \"kubernetes.io/projected/ad58fa0b-b61d-4afa-bf14-f82b2b976bdd-kube-api-access-dxjt7\") pod \"apiserver-76f77b778f-cmpqd\" (UID: \"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd\") " pod="openshift-apiserver/apiserver-76f77b778f-cmpqd"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.928571 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdxqg\" (UniqueName: \"kubernetes.io/projected/a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5-kube-api-access-cdxqg\") pod \"openshift-config-operator-7777fb866f-dxgld\" (UID: \"a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.939704 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk6j9\" (UniqueName: \"kubernetes.io/projected/06068c71-eeec-4220-8279-579f47741023-kube-api-access-bk6j9\") pod \"cluster-samples-operator-665b6dd947-66zm9\" (UID: \"06068c71-eeec-4220-8279-579f47741023\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-66zm9"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.969677 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkdfx\" (UniqueName: \"kubernetes.io/projected/0cae56c9-6ca6-49f3-97d2-14e8a6748315-kube-api-access-dkdfx\") pod \"downloads-7954f5f757-bnpxs\" (UID: \"0cae56c9-6ca6-49f3-97d2-14e8a6748315\") " pod="openshift-console/downloads-7954f5f757-bnpxs"
Oct 05 20:17:11 crc kubenswrapper[4753]: I1005 20:17:11.982992 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxfn5\" (UniqueName: \"kubernetes.io/projected/5615672e-59bf-448b-88ba-75a02438a8ad-kube-api-access-cxfn5\") pod \"route-controller-manager-6576b87f9c-jcc7v\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.007976 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.009342 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bn8t\" (UniqueName: \"kubernetes.io/projected/6f65814d-e7b3-425a-b7a5-86c7e062ad35-kube-api-access-2bn8t\") pod \"authentication-operator-69f744f599-9w7kh\" (UID: \"6f65814d-e7b3-425a-b7a5-86c7e062ad35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.034007 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nspjp\" (UniqueName: \"kubernetes.io/projected/6e73e9c7-3af4-4b10-a331-7899608702b3-kube-api-access-nspjp\") pod \"controller-manager-879f6c89f-x9jfj\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.040701 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65qg2\" (UniqueName: \"kubernetes.io/projected/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-kube-api-access-65qg2\") pod \"oauth-openshift-558db77b4-tpd8r\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.049704 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7klvp"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.054888 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1b80304-1bf0-464a-b196-392d4c4a6e6b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8h7qr\" (UID: \"f1b80304-1bf0-464a-b196-392d4c4a6e6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.060339 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.063359 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-66zm9"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.069635 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.073028 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxxtp\" (UniqueName: \"kubernetes.io/projected/91f436b8-10e7-4be0-9fd5-3047c5fafa45-kube-api-access-xxxtp\") pod \"openshift-apiserver-operator-796bbdcf4f-gnr4l\" (UID: \"91f436b8-10e7-4be0-9fd5-3047c5fafa45\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.078950 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.098683 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cmpqd"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.100427 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.120042 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.140162 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.166496 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.166773 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.168182 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-d8b6f"]
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.179753 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.189770 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.202761 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.215929 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-bnpxs"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.218492 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.229763 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.239133 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.259603 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.283981 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.286709 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.303865 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.319054 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.320512 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.331117 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dxgld"]
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.340188 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.359028 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.367869 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7klvp"]
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.389878 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.395988 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cmpqd"]
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.399913 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Oct 05 20:17:12 crc kubenswrapper[4753]: W1005 20:17:12.402848 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb329af5_99e8_42d9_b79e_4c9acd09204d.slice/crio-82648570a585e6023562d0debdcb0b25fe43c56a25c25e260f29d09543f0a6c0 WatchSource:0}: Error finding container 82648570a585e6023562d0debdcb0b25fe43c56a25c25e260f29d09543f0a6c0: Status 404 returned error can't find the container with id 82648570a585e6023562d0debdcb0b25fe43c56a25c25e260f29d09543f0a6c0
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.420813 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.440031 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.460845 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r"]
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.463334 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.483836 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.504992 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.507292 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-66zm9"]
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.520946 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.538666 4753 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.557670 4753 request.go:700] Waited for 1.892068223s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.559783 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.579429 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.581929 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7klvp" event={"ID":"eb329af5-99e8-42d9-b79e-4c9acd09204d","Type":"ContainerStarted","Data":"82648570a585e6023562d0debdcb0b25fe43c56a25c25e260f29d09543f0a6c0"}
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.583498 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" event={"ID":"a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5","Type":"ContainerStarted","Data":"cfc861c9de152d11a2dbd634beeb73247884ac42f56e7ee6a9dd531fe7461efc"}
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.583541 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" event={"ID":"a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5","Type":"ContainerStarted","Data":"4fc6c2e2111aa5c780aa4f8ec273c8ae1a2e6a7675f92eac41224537a5322870"}
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.587571 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" event={"ID":"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd","Type":"ContainerStarted","Data":"5a8e70020fbc0c9a61ab0af15d1c226145a8778c7dd31923712091bca3eb99f6"}
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.589293 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r" event={"ID":"a2a48e1a-ef95-45ff-89b4-6779d95a2096","Type":"ContainerStarted","Data":"5e77c267ac83f18fccad49ef05d8305b55e8db6435ca63c7dce79be1ee994989"}
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.592848 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f" event={"ID":"22db54ee-7d52-475e-a824-9e563b2920e8","Type":"ContainerStarted","Data":"6ac03d9b61271669b4d5f62d96ecc7848958a53dfffce0687ed097d9a2a7b227"}
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.592884 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f" event={"ID":"22db54ee-7d52-475e-a824-9e563b2920e8","Type":"ContainerStarted","Data":"1421c0338dd7cc974d6b5968e5ee484070554e42a3e83ab11c44b42bbe19bef5"}
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.599616 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.619237 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.644424 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.658346 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.678592 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.718612 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tpd8r"]
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.728237 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt6lr\" (UniqueName: \"kubernetes.io/projected/99413e5b-15a4-40f6-b7a5-2dbee5eafb1d-kube-api-access-qt6lr\") pod \"migrator-59844c95c7-xjp5g\" (UID: \"99413e5b-15a4-40f6-b7a5-2dbee5eafb1d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjp5g"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.733041 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lkdpm\" (UID: \"b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm"
Oct 05 20:17:12 crc kubenswrapper[4753]: E1005 20:17:12.751241 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda414d9b0_4b87_4ecc_ae67_f9f38c33a3f5.slice/crio-cfc861c9de152d11a2dbd634beeb73247884ac42f56e7ee6a9dd531fe7461efc.scope\": RecentStats: unable to find data in memory cache]"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.759037 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sdjg\" (UniqueName: \"kubernetes.io/projected/f647e6b6-7d7f-4c72-9506-af98598583fc-kube-api-access-8sdjg\") pod \"dns-operator-744455d44c-q9dxv\" (UID: \"f647e6b6-7d7f-4c72-9506-af98598583fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-q9dxv"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.776672 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd241b3b-a68f-487b-bcd8-ca61782d9e4f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zpdnw\" (UID: \"cd241b3b-a68f-487b-bcd8-ca61782d9e4f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.794605 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-bnpxs"]
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.794747 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9w7kh"]
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.804545 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr"]
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.805959 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q9dxv"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.807334 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ggbx\" (UniqueName: \"kubernetes.io/projected/b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5-kube-api-access-7ggbx\") pod \"ingress-operator-5b745b69d9-lkdpm\" (UID: \"b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.821941 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qghh\" (UniqueName: \"kubernetes.io/projected/9a13243e-13d5-4a22-9417-8dfc3896f332-kube-api-access-9qghh\") pod \"machine-config-operator-74547568cd-h6ntx\" (UID: \"9a13243e-13d5-4a22-9417-8dfc3896f332\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.824320 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9jfj"]
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.832307 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm"
Oct 05 20:17:12 crc kubenswrapper[4753]: W1005 20:17:12.836457 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f65814d_e7b3_425a_b7a5_86c7e062ad35.slice/crio-7a65ba5030aa0155b021f4c96c390eade5af16ccf684d03be84ac6e78cce35ed WatchSource:0}: Error finding container 7a65ba5030aa0155b021f4c96c390eade5af16ccf684d03be84ac6e78cce35ed: Status 404 returned error can't find the container with id 7a65ba5030aa0155b021f4c96c390eade5af16ccf684d03be84ac6e78cce35ed
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.836875 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef7125ff-9b89-4972-954b-61145623ecec-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cwjwm\" (UID: \"ef7125ff-9b89-4972-954b-61145623ecec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.838028 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.844884 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.852831 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.856237 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwrc6\" (UniqueName: \"kubernetes.io/projected/5b5d8bc9-c226-4476-a309-fb21a5f79af3-kube-api-access-xwrc6\") pod \"kube-storage-version-migrator-operator-b67b599dd-59prk\" (UID: \"5b5d8bc9-c226-4476-a309-fb21a5f79af3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.859486 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjp5g"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.873989 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.874960 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp4z7\" (UniqueName: \"kubernetes.io/projected/3a0f0082-3cf0-4339-9964-02916c137f45-kube-api-access-xp4z7\") pod \"machine-approver-56656f9798-9h4dc\" (UID: \"3a0f0082-3cf0-4339-9964-02916c137f45\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.906755 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v"]
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.920655 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l"]
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.954618 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-bound-sa-token\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955006 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a19f7732-a06f-4b93-b549-4c0a4ccd23fa-trusted-ca\") pod \"console-operator-58897d9998-wsrwg\" (UID: \"a19f7732-a06f-4b93-b549-4c0a4ccd23fa\") " pod="openshift-console-operator/console-operator-58897d9998-wsrwg"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955029 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8855a1b-3e62-4ed3-acb0-eb0663d8df01-metrics-certs\") pod \"router-default-5444994796-zbdx5\" (UID: \"e8855a1b-3e62-4ed3-acb0-eb0663d8df01\") " pod="openshift-ingress/router-default-5444994796-zbdx5"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955049 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-registry-certificates\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955069 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86e1bede-355f-425d-b6c8-b300f7addf32-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t6jr9\" (UID: \"86e1bede-355f-425d-b6c8-b300f7addf32\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955088 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a19f7732-a06f-4b93-b549-4c0a4ccd23fa-config\") pod \"console-operator-58897d9998-wsrwg\" (UID: \"a19f7732-a06f-4b93-b549-4c0a4ccd23fa\") " pod="openshift-console-operator/console-operator-58897d9998-wsrwg"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955106 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86e1bede-355f-425d-b6c8-b300f7addf32-config\") pod \"kube-apiserver-operator-766d6c64bb-t6jr9\" (UID: \"86e1bede-355f-425d-b6c8-b300f7addf32\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955172 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955212 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8855a1b-3e62-4ed3-acb0-eb0663d8df01-service-ca-bundle\") pod \"router-default-5444994796-zbdx5\" (UID: \"e8855a1b-3e62-4ed3-acb0-eb0663d8df01\") " pod="openshift-ingress/router-default-5444994796-zbdx5"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955231 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6w7g\" (UniqueName: \"kubernetes.io/projected/e8855a1b-3e62-4ed3-acb0-eb0663d8df01-kube-api-access-f6w7g\") pod \"router-default-5444994796-zbdx5\" (UID: \"e8855a1b-3e62-4ed3-acb0-eb0663d8df01\") " pod="openshift-ingress/router-default-5444994796-zbdx5"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955250 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e8855a1b-3e62-4ed3-acb0-eb0663d8df01-stats-auth\") pod \"router-default-5444994796-zbdx5\" (UID: \"e8855a1b-3e62-4ed3-acb0-eb0663d8df01\") " pod="openshift-ingress/router-default-5444994796-zbdx5"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955289 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955315 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e8855a1b-3e62-4ed3-acb0-eb0663d8df01-default-certificate\") pod \"router-default-5444994796-zbdx5\" (UID: \"e8855a1b-3e62-4ed3-acb0-eb0663d8df01\") " pod="openshift-ingress/router-default-5444994796-zbdx5"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955337 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86e1bede-355f-425d-b6c8-b300f7addf32-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t6jr9\" (UID: \"86e1bede-355f-425d-b6c8-b300f7addf32\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955354 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/322f0415-b81e-4f7a-adec-e49a0460844b-serving-cert\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955388 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-trusted-ca\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955415 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/322f0415-b81e-4f7a-adec-e49a0460844b-audit-policies\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955432 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7da48090-042e-4fef-afdf-9e6e54a89fe2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h7rzp\" (UID: \"7da48090-042e-4fef-afdf-9e6e54a89fe2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h7rzp"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955451 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-registry-tls\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955470 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322f0415-b81e-4f7a-adec-e49a0460844b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955491 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-config\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955507 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/322f0415-b81e-4f7a-adec-e49a0460844b-audit-dir\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852"
Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955526 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk29h\" (UniqueName: \"kubernetes.io/projected/a19f7732-a06f-4b93-b549-4c0a4ccd23fa-kube-api-access-sk29h\") pod \"console-operator-58897d9998-wsrwg\" (UID: \"a19f7732-a06f-4b93-b549-4c0a4ccd23fa\") " pod="openshift-console-operator/console-operator-58897d9998-wsrwg"
Oct 05 20:17:12 crc
kubenswrapper[4753]: I1005 20:17:12.955557 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-etcd-client\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955580 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-etcd-ca\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955602 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/322f0415-b81e-4f7a-adec-e49a0460844b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955634 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwrbx\" (UniqueName: \"kubernetes.io/projected/322f0415-b81e-4f7a-adec-e49a0460844b-kube-api-access-xwrbx\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955651 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a19f7732-a06f-4b93-b549-4c0a4ccd23fa-serving-cert\") pod \"console-operator-58897d9998-wsrwg\" (UID: 
\"a19f7732-a06f-4b93-b549-4c0a4ccd23fa\") " pod="openshift-console-operator/console-operator-58897d9998-wsrwg" Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955682 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/322f0415-b81e-4f7a-adec-e49a0460844b-etcd-client\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955699 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/322f0415-b81e-4f7a-adec-e49a0460844b-encryption-config\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955716 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcdlz\" (UniqueName: \"kubernetes.io/projected/7da48090-042e-4fef-afdf-9e6e54a89fe2-kube-api-access-qcdlz\") pod \"control-plane-machine-set-operator-78cbb6b69f-h7rzp\" (UID: \"7da48090-042e-4fef-afdf-9e6e54a89fe2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h7rzp" Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955735 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-etcd-service-ca\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955777 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955796 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2hd6\" (UniqueName: \"kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-kube-api-access-p2hd6\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955821 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvhg2\" (UniqueName: \"kubernetes.io/projected/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-kube-api-access-gvhg2\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:12 crc kubenswrapper[4753]: I1005 20:17:12.955847 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-serving-cert\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:12 crc kubenswrapper[4753]: E1005 20:17:12.959469 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:13.459453298 +0000 UTC m=+142.307781540 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:12 crc kubenswrapper[4753]: W1005 20:17:12.997506 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91f436b8_10e7_4be0_9fd5_3047c5fafa45.slice/crio-9779b336508ca26fd2de4e2b6906da3f7d2ab5181367b324fb66f63f218489dd WatchSource:0}: Error finding container 9779b336508ca26fd2de4e2b6906da3f7d2ab5181367b324fb66f63f218489dd: Status 404 returned error can't find the container with id 9779b336508ca26fd2de4e2b6906da3f7d2ab5181367b324fb66f63f218489dd Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.059691 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.059860 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/322f0415-b81e-4f7a-adec-e49a0460844b-serving-cert\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.059890 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/10c622e2-78c9-42bd-9031-776cded4435c-secret-volume\") pod \"collect-profiles-29328255-xfjfm\" (UID: \"10c622e2-78c9-42bd-9031-776cded4435c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" Oct 05 20:17:13 crc kubenswrapper[4753]: E1005 20:17:13.059924 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:13.559899153 +0000 UTC m=+142.408227385 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.059957 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b92548e-588c-43d5-a99e-9fe9558c8526-metrics-tls\") pod \"dns-default-459zr\" (UID: \"2b92548e-588c-43d5-a99e-9fe9558c8526\") " pod="openshift-dns/dns-default-459zr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.059987 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/170ae05d-7931-4f6b-a1b1-d4cf1c6709ba-cert\") pod \"ingress-canary-9wgjd\" (UID: \"170ae05d-7931-4f6b-a1b1-d4cf1c6709ba\") " pod="openshift-ingress-canary/ingress-canary-9wgjd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.060012 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-trusted-ca\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.060032 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/322f0415-b81e-4f7a-adec-e49a0460844b-audit-policies\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.060050 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7da48090-042e-4fef-afdf-9e6e54a89fe2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h7rzp\" (UID: \"7da48090-042e-4fef-afdf-9e6e54a89fe2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h7rzp" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.060068 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn4mc\" (UniqueName: \"kubernetes.io/projected/170ae05d-7931-4f6b-a1b1-d4cf1c6709ba-kube-api-access-xn4mc\") pod \"ingress-canary-9wgjd\" (UID: \"170ae05d-7931-4f6b-a1b1-d4cf1c6709ba\") " pod="openshift-ingress-canary/ingress-canary-9wgjd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.060087 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-registry-tls\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc 
kubenswrapper[4753]: I1005 20:17:13.060150 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322f0415-b81e-4f7a-adec-e49a0460844b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.060415 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f9cb473-0c3f-4b38-96ab-5d6243a7aa85-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vvtsf\" (UID: \"8f9cb473-0c3f-4b38-96ab-5d6243a7aa85\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.060446 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-config\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.060463 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/322f0415-b81e-4f7a-adec-e49a0460844b-audit-dir\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.065657 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7da48090-042e-4fef-afdf-9e6e54a89fe2-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-h7rzp\" (UID: \"7da48090-042e-4fef-afdf-9e6e54a89fe2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h7rzp" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.066351 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/322f0415-b81e-4f7a-adec-e49a0460844b-audit-dir\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.066907 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/322f0415-b81e-4f7a-adec-e49a0460844b-audit-policies\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.066918 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322f0415-b81e-4f7a-adec-e49a0460844b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.067507 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-config\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.071389 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-registry-tls\") pod 
\"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.071473 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-trusted-ca\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.075823 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk29h\" (UniqueName: \"kubernetes.io/projected/a19f7732-a06f-4b93-b549-4c0a4ccd23fa-kube-api-access-sk29h\") pod \"console-operator-58897d9998-wsrwg\" (UID: \"a19f7732-a06f-4b93-b549-4c0a4ccd23fa\") " pod="openshift-console-operator/console-operator-58897d9998-wsrwg" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.081076 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q9dxv"] Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.081852 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f3b053c-4395-46db-9d64-094d422f8145-serving-cert\") pod \"service-ca-operator-777779d784-n7kdk\" (UID: \"1f3b053c-4395-46db-9d64-094d422f8145\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.081885 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f3b053c-4395-46db-9d64-094d422f8145-config\") pod \"service-ca-operator-777779d784-n7kdk\" (UID: \"1f3b053c-4395-46db-9d64-094d422f8145\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.081922 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-etcd-client\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.081943 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-etcd-ca\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.081962 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/322f0415-b81e-4f7a-adec-e49a0460844b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082008 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbw7r\" (UniqueName: \"kubernetes.io/projected/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-kube-api-access-wbw7r\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082057 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwrbx\" (UniqueName: \"kubernetes.io/projected/322f0415-b81e-4f7a-adec-e49a0460844b-kube-api-access-xwrbx\") pod \"apiserver-7bbb656c7d-k5852\" (UID: 
\"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082074 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a19f7732-a06f-4b93-b549-4c0a4ccd23fa-serving-cert\") pod \"console-operator-58897d9998-wsrwg\" (UID: \"a19f7732-a06f-4b93-b549-4c0a4ccd23fa\") " pod="openshift-console-operator/console-operator-58897d9998-wsrwg" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082099 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/322f0415-b81e-4f7a-adec-e49a0460844b-etcd-client\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082116 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/322f0415-b81e-4f7a-adec-e49a0460844b-encryption-config\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082132 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcdlz\" (UniqueName: \"kubernetes.io/projected/7da48090-042e-4fef-afdf-9e6e54a89fe2-kube-api-access-qcdlz\") pod \"control-plane-machine-set-operator-78cbb6b69f-h7rzp\" (UID: \"7da48090-042e-4fef-afdf-9e6e54a89fe2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h7rzp" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082186 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-etcd-service-ca\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082204 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d3383e66-50dd-4d8a-a24f-faa83ad90022-signing-key\") pod \"service-ca-9c57cc56f-4pbr6\" (UID: \"d3383e66-50dd-4d8a-a24f-faa83ad90022\") " pod="openshift-service-ca/service-ca-9c57cc56f-4pbr6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082221 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/74a4246e-6a4f-4e6b-9562-a3746d802003-tmpfs\") pod \"packageserver-d55dfcdfc-xkst4\" (UID: \"74a4246e-6a4f-4e6b-9562-a3746d802003\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082238 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082264 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2hd6\" (UniqueName: \"kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-kube-api-access-p2hd6\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082279 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gvhg2\" (UniqueName: \"kubernetes.io/projected/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-kube-api-access-gvhg2\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082295 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d3383e66-50dd-4d8a-a24f-faa83ad90022-signing-cabundle\") pod \"service-ca-9c57cc56f-4pbr6\" (UID: \"d3383e66-50dd-4d8a-a24f-faa83ad90022\") " pod="openshift-service-ca/service-ca-9c57cc56f-4pbr6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082314 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-registration-dir\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082342 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-serving-cert\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082357 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vt2k\" (UniqueName: \"kubernetes.io/projected/834546fa-9104-4394-a674-e0350de62fb1-kube-api-access-9vt2k\") pod \"marketplace-operator-79b997595-gx875\" (UID: \"834546fa-9104-4394-a674-e0350de62fb1\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-gx875" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082376 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e42464d4-5bc2-41b9-b28c-0dae67783433-profile-collector-cert\") pod \"catalog-operator-68c6474976-2fqws\" (UID: \"e42464d4-5bc2-41b9-b28c-0dae67783433\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082415 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-bound-sa-token\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082452 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0487fb96-92f2-462f-a454-528377db3dd4-proxy-tls\") pod \"machine-config-controller-84d6567774-8qwsm\" (UID: \"0487fb96-92f2-462f-a454-528377db3dd4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082492 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a19f7732-a06f-4b93-b549-4c0a4ccd23fa-trusted-ca\") pod \"console-operator-58897d9998-wsrwg\" (UID: \"a19f7732-a06f-4b93-b549-4c0a4ccd23fa\") " pod="openshift-console-operator/console-operator-58897d9998-wsrwg" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082509 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/2b92548e-588c-43d5-a99e-9fe9558c8526-config-volume\") pod \"dns-default-459zr\" (UID: \"2b92548e-588c-43d5-a99e-9fe9558c8526\") " pod="openshift-dns/dns-default-459zr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082525 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwh6b\" (UniqueName: \"kubernetes.io/projected/43435d4b-6820-4794-a431-3946e0486b20-kube-api-access-fwh6b\") pod \"olm-operator-6b444d44fb-rrzxd\" (UID: \"43435d4b-6820-4794-a431-3946e0486b20\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082550 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/834546fa-9104-4394-a674-e0350de62fb1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gx875\" (UID: \"834546fa-9104-4394-a674-e0350de62fb1\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx875" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082578 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8855a1b-3e62-4ed3-acb0-eb0663d8df01-metrics-certs\") pod \"router-default-5444994796-zbdx5\" (UID: \"e8855a1b-3e62-4ed3-acb0-eb0663d8df01\") " pod="openshift-ingress/router-default-5444994796-zbdx5" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082624 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-registry-certificates\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082645 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w268n\" (UniqueName: \"kubernetes.io/projected/8f9cb473-0c3f-4b38-96ab-5d6243a7aa85-kube-api-access-w268n\") pod \"package-server-manager-789f6589d5-vvtsf\" (UID: \"8f9cb473-0c3f-4b38-96ab-5d6243a7aa85\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082670 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86e1bede-355f-425d-b6c8-b300f7addf32-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t6jr9\" (UID: \"86e1bede-355f-425d-b6c8-b300f7addf32\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082685 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0992c967-65e9-45d8-abe0-80fdb0f4fbcf-certs\") pod \"machine-config-server-d78vr\" (UID: \"0992c967-65e9-45d8-abe0-80fdb0f4fbcf\") " pod="openshift-machine-config-operator/machine-config-server-d78vr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082701 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8ce14c23-57a1-469c-b46c-7b57907dc234-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-nfxm6\" (UID: \"8ce14c23-57a1-469c-b46c-7b57907dc234\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nfxm6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082716 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0487fb96-92f2-462f-a454-528377db3dd4-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-8qwsm\" (UID: \"0487fb96-92f2-462f-a454-528377db3dd4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082732 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-socket-dir\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082746 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/43435d4b-6820-4794-a431-3946e0486b20-srv-cert\") pod \"olm-operator-6b444d44fb-rrzxd\" (UID: \"43435d4b-6820-4794-a431-3946e0486b20\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082780 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrkfc\" (UniqueName: \"kubernetes.io/projected/74a4246e-6a4f-4e6b-9562-a3746d802003-kube-api-access-xrkfc\") pod \"packageserver-d55dfcdfc-xkst4\" (UID: \"74a4246e-6a4f-4e6b-9562-a3746d802003\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082812 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr9z9\" (UniqueName: \"kubernetes.io/projected/e42464d4-5bc2-41b9-b28c-0dae67783433-kube-api-access-hr9z9\") pod \"catalog-operator-68c6474976-2fqws\" (UID: \"e42464d4-5bc2-41b9-b28c-0dae67783433\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082834 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10c622e2-78c9-42bd-9031-776cded4435c-config-volume\") pod \"collect-profiles-29328255-xfjfm\" (UID: \"10c622e2-78c9-42bd-9031-776cded4435c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082865 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a19f7732-a06f-4b93-b549-4c0a4ccd23fa-config\") pod \"console-operator-58897d9998-wsrwg\" (UID: \"a19f7732-a06f-4b93-b549-4c0a4ccd23fa\") " pod="openshift-console-operator/console-operator-58897d9998-wsrwg" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082880 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhgjl\" (UniqueName: \"kubernetes.io/projected/0487fb96-92f2-462f-a454-528377db3dd4-kube-api-access-lhgjl\") pod \"machine-config-controller-84d6567774-8qwsm\" (UID: \"0487fb96-92f2-462f-a454-528377db3dd4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082903 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/74a4246e-6a4f-4e6b-9562-a3746d802003-webhook-cert\") pod \"packageserver-d55dfcdfc-xkst4\" (UID: \"74a4246e-6a4f-4e6b-9562-a3746d802003\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082918 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg66b\" (UniqueName: \"kubernetes.io/projected/2b92548e-588c-43d5-a99e-9fe9558c8526-kube-api-access-wg66b\") pod \"dns-default-459zr\" (UID: 
\"2b92548e-588c-43d5-a99e-9fe9558c8526\") " pod="openshift-dns/dns-default-459zr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082932 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-mountpoint-dir\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082976 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86e1bede-355f-425d-b6c8-b300f7addf32-config\") pod \"kube-apiserver-operator-766d6c64bb-t6jr9\" (UID: \"86e1bede-355f-425d-b6c8-b300f7addf32\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.082993 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/74a4246e-6a4f-4e6b-9562-a3746d802003-apiservice-cert\") pod \"packageserver-d55dfcdfc-xkst4\" (UID: \"74a4246e-6a4f-4e6b-9562-a3746d802003\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.083007 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-csi-data-dir\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.083022 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtwgc\" (UniqueName: 
\"kubernetes.io/projected/8ce14c23-57a1-469c-b46c-7b57907dc234-kube-api-access-rtwgc\") pod \"multus-admission-controller-857f4d67dd-nfxm6\" (UID: \"8ce14c23-57a1-469c-b46c-7b57907dc234\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nfxm6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.083046 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/834546fa-9104-4394-a674-e0350de62fb1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gx875\" (UID: \"834546fa-9104-4394-a674-e0350de62fb1\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx875" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.083063 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0992c967-65e9-45d8-abe0-80fdb0f4fbcf-node-bootstrap-token\") pod \"machine-config-server-d78vr\" (UID: \"0992c967-65e9-45d8-abe0-80fdb0f4fbcf\") " pod="openshift-machine-config-operator/machine-config-server-d78vr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.083094 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.083120 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/43435d4b-6820-4794-a431-3946e0486b20-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rrzxd\" (UID: \"43435d4b-6820-4794-a431-3946e0486b20\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.083167 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7vw9\" (UniqueName: \"kubernetes.io/projected/0992c967-65e9-45d8-abe0-80fdb0f4fbcf-kube-api-access-p7vw9\") pod \"machine-config-server-d78vr\" (UID: \"0992c967-65e9-45d8-abe0-80fdb0f4fbcf\") " pod="openshift-machine-config-operator/machine-config-server-d78vr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.083183 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e42464d4-5bc2-41b9-b28c-0dae67783433-srv-cert\") pod \"catalog-operator-68c6474976-2fqws\" (UID: \"e42464d4-5bc2-41b9-b28c-0dae67783433\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.083198 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvm8w\" (UniqueName: \"kubernetes.io/projected/d3383e66-50dd-4d8a-a24f-faa83ad90022-kube-api-access-vvm8w\") pod \"service-ca-9c57cc56f-4pbr6\" (UID: \"d3383e66-50dd-4d8a-a24f-faa83ad90022\") " pod="openshift-service-ca/service-ca-9c57cc56f-4pbr6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.083213 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-plugins-dir\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.083228 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkzzj\" (UniqueName: 
\"kubernetes.io/projected/10c622e2-78c9-42bd-9031-776cded4435c-kube-api-access-nkzzj\") pod \"collect-profiles-29328255-xfjfm\" (UID: \"10c622e2-78c9-42bd-9031-776cded4435c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.083985 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-etcd-ca\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.084433 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/322f0415-b81e-4f7a-adec-e49a0460844b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.085171 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8855a1b-3e62-4ed3-acb0-eb0663d8df01-service-ca-bundle\") pod \"router-default-5444994796-zbdx5\" (UID: \"e8855a1b-3e62-4ed3-acb0-eb0663d8df01\") " pod="openshift-ingress/router-default-5444994796-zbdx5" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.085235 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s44w8\" (UniqueName: \"kubernetes.io/projected/1f3b053c-4395-46db-9d64-094d422f8145-kube-api-access-s44w8\") pod \"service-ca-operator-777779d784-n7kdk\" (UID: \"1f3b053c-4395-46db-9d64-094d422f8145\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.087726 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-etcd-service-ca\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.087966 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6w7g\" (UniqueName: \"kubernetes.io/projected/e8855a1b-3e62-4ed3-acb0-eb0663d8df01-kube-api-access-f6w7g\") pod \"router-default-5444994796-zbdx5\" (UID: \"e8855a1b-3e62-4ed3-acb0-eb0663d8df01\") " pod="openshift-ingress/router-default-5444994796-zbdx5" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.088026 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e8855a1b-3e62-4ed3-acb0-eb0663d8df01-stats-auth\") pod \"router-default-5444994796-zbdx5\" (UID: \"e8855a1b-3e62-4ed3-acb0-eb0663d8df01\") " pod="openshift-ingress/router-default-5444994796-zbdx5" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.088075 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.088099 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e8855a1b-3e62-4ed3-acb0-eb0663d8df01-default-certificate\") pod \"router-default-5444994796-zbdx5\" (UID: \"e8855a1b-3e62-4ed3-acb0-eb0663d8df01\") " pod="openshift-ingress/router-default-5444994796-zbdx5" Oct 05 20:17:13 crc 
kubenswrapper[4753]: I1005 20:17:13.088120 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86e1bede-355f-425d-b6c8-b300f7addf32-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t6jr9\" (UID: \"86e1bede-355f-425d-b6c8-b300f7addf32\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.088497 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.089061 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8855a1b-3e62-4ed3-acb0-eb0663d8df01-service-ca-bundle\") pod \"router-default-5444994796-zbdx5\" (UID: \"e8855a1b-3e62-4ed3-acb0-eb0663d8df01\") " pod="openshift-ingress/router-default-5444994796-zbdx5" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.089485 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/322f0415-b81e-4f7a-adec-e49a0460844b-serving-cert\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.089958 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-registry-certificates\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.092191 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a19f7732-a06f-4b93-b549-4c0a4ccd23fa-trusted-ca\") pod \"console-operator-58897d9998-wsrwg\" (UID: \"a19f7732-a06f-4b93-b549-4c0a4ccd23fa\") " pod="openshift-console-operator/console-operator-58897d9998-wsrwg" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.092348 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86e1bede-355f-425d-b6c8-b300f7addf32-config\") pod \"kube-apiserver-operator-766d6c64bb-t6jr9\" (UID: \"86e1bede-355f-425d-b6c8-b300f7addf32\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9" Oct 05 20:17:13 crc kubenswrapper[4753]: E1005 20:17:13.092577 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:13.592561713 +0000 UTC m=+142.440890035 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.092625 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-etcd-client\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.093321 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a19f7732-a06f-4b93-b549-4c0a4ccd23fa-config\") pod \"console-operator-58897d9998-wsrwg\" (UID: \"a19f7732-a06f-4b93-b549-4c0a4ccd23fa\") " pod="openshift-console-operator/console-operator-58897d9998-wsrwg" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.094735 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e8855a1b-3e62-4ed3-acb0-eb0663d8df01-default-certificate\") pod \"router-default-5444994796-zbdx5\" (UID: \"e8855a1b-3e62-4ed3-acb0-eb0663d8df01\") " pod="openshift-ingress/router-default-5444994796-zbdx5" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.094794 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.095150 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86e1bede-355f-425d-b6c8-b300f7addf32-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t6jr9\" (UID: \"86e1bede-355f-425d-b6c8-b300f7addf32\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.095698 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a19f7732-a06f-4b93-b549-4c0a4ccd23fa-serving-cert\") pod \"console-operator-58897d9998-wsrwg\" (UID: \"a19f7732-a06f-4b93-b549-4c0a4ccd23fa\") " pod="openshift-console-operator/console-operator-58897d9998-wsrwg" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.107016 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-serving-cert\") pod \"etcd-operator-b45778765-rwrj7\" (UID: \"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.107273 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e8855a1b-3e62-4ed3-acb0-eb0663d8df01-stats-auth\") pod \"router-default-5444994796-zbdx5\" (UID: \"e8855a1b-3e62-4ed3-acb0-eb0663d8df01\") " pod="openshift-ingress/router-default-5444994796-zbdx5" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.107343 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8855a1b-3e62-4ed3-acb0-eb0663d8df01-metrics-certs\") pod \"router-default-5444994796-zbdx5\" (UID: 
\"e8855a1b-3e62-4ed3-acb0-eb0663d8df01\") " pod="openshift-ingress/router-default-5444994796-zbdx5" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.107759 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.108519 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/322f0415-b81e-4f7a-adec-e49a0460844b-etcd-client\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.114332 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/322f0415-b81e-4f7a-adec-e49a0460844b-encryption-config\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.115572 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk29h\" (UniqueName: \"kubernetes.io/projected/a19f7732-a06f-4b93-b549-4c0a4ccd23fa-kube-api-access-sk29h\") pod \"console-operator-58897d9998-wsrwg\" (UID: \"a19f7732-a06f-4b93-b549-4c0a4ccd23fa\") " pod="openshift-console-operator/console-operator-58897d9998-wsrwg" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.172379 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw"] Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 
20:17:13.173486 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwrbx\" (UniqueName: \"kubernetes.io/projected/322f0415-b81e-4f7a-adec-e49a0460844b-kube-api-access-xwrbx\") pod \"apiserver-7bbb656c7d-k5852\" (UID: \"322f0415-b81e-4f7a-adec-e49a0460844b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.193848 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194052 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e42464d4-5bc2-41b9-b28c-0dae67783433-profile-collector-cert\") pod \"catalog-operator-68c6474976-2fqws\" (UID: \"e42464d4-5bc2-41b9-b28c-0dae67783433\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194076 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vt2k\" (UniqueName: \"kubernetes.io/projected/834546fa-9104-4394-a674-e0350de62fb1-kube-api-access-9vt2k\") pod \"marketplace-operator-79b997595-gx875\" (UID: \"834546fa-9104-4394-a674-e0350de62fb1\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx875" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194107 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0487fb96-92f2-462f-a454-528377db3dd4-proxy-tls\") pod \"machine-config-controller-84d6567774-8qwsm\" (UID: \"0487fb96-92f2-462f-a454-528377db3dd4\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194129 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/834546fa-9104-4394-a674-e0350de62fb1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gx875\" (UID: \"834546fa-9104-4394-a674-e0350de62fb1\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx875" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194162 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b92548e-588c-43d5-a99e-9fe9558c8526-config-volume\") pod \"dns-default-459zr\" (UID: \"2b92548e-588c-43d5-a99e-9fe9558c8526\") " pod="openshift-dns/dns-default-459zr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194177 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwh6b\" (UniqueName: \"kubernetes.io/projected/43435d4b-6820-4794-a431-3946e0486b20-kube-api-access-fwh6b\") pod \"olm-operator-6b444d44fb-rrzxd\" (UID: \"43435d4b-6820-4794-a431-3946e0486b20\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194200 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w268n\" (UniqueName: \"kubernetes.io/projected/8f9cb473-0c3f-4b38-96ab-5d6243a7aa85-kube-api-access-w268n\") pod \"package-server-manager-789f6589d5-vvtsf\" (UID: \"8f9cb473-0c3f-4b38-96ab-5d6243a7aa85\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194217 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/0992c967-65e9-45d8-abe0-80fdb0f4fbcf-certs\") pod \"machine-config-server-d78vr\" (UID: \"0992c967-65e9-45d8-abe0-80fdb0f4fbcf\") " pod="openshift-machine-config-operator/machine-config-server-d78vr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194231 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8ce14c23-57a1-469c-b46c-7b57907dc234-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-nfxm6\" (UID: \"8ce14c23-57a1-469c-b46c-7b57907dc234\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nfxm6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194249 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0487fb96-92f2-462f-a454-528377db3dd4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8qwsm\" (UID: \"0487fb96-92f2-462f-a454-528377db3dd4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194264 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrkfc\" (UniqueName: \"kubernetes.io/projected/74a4246e-6a4f-4e6b-9562-a3746d802003-kube-api-access-xrkfc\") pod \"packageserver-d55dfcdfc-xkst4\" (UID: \"74a4246e-6a4f-4e6b-9562-a3746d802003\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194279 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-socket-dir\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194294 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/43435d4b-6820-4794-a431-3946e0486b20-srv-cert\") pod \"olm-operator-6b444d44fb-rrzxd\" (UID: \"43435d4b-6820-4794-a431-3946e0486b20\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194308 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr9z9\" (UniqueName: \"kubernetes.io/projected/e42464d4-5bc2-41b9-b28c-0dae67783433-kube-api-access-hr9z9\") pod \"catalog-operator-68c6474976-2fqws\" (UID: \"e42464d4-5bc2-41b9-b28c-0dae67783433\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194323 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10c622e2-78c9-42bd-9031-776cded4435c-config-volume\") pod \"collect-profiles-29328255-xfjfm\" (UID: \"10c622e2-78c9-42bd-9031-776cded4435c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194340 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhgjl\" (UniqueName: \"kubernetes.io/projected/0487fb96-92f2-462f-a454-528377db3dd4-kube-api-access-lhgjl\") pod \"machine-config-controller-84d6567774-8qwsm\" (UID: \"0487fb96-92f2-462f-a454-528377db3dd4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194356 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/74a4246e-6a4f-4e6b-9562-a3746d802003-webhook-cert\") pod \"packageserver-d55dfcdfc-xkst4\" (UID: \"74a4246e-6a4f-4e6b-9562-a3746d802003\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194374 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg66b\" (UniqueName: \"kubernetes.io/projected/2b92548e-588c-43d5-a99e-9fe9558c8526-kube-api-access-wg66b\") pod \"dns-default-459zr\" (UID: \"2b92548e-588c-43d5-a99e-9fe9558c8526\") " pod="openshift-dns/dns-default-459zr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194399 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-mountpoint-dir\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194429 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-csi-data-dir\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194451 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtwgc\" (UniqueName: \"kubernetes.io/projected/8ce14c23-57a1-469c-b46c-7b57907dc234-kube-api-access-rtwgc\") pod \"multus-admission-controller-857f4d67dd-nfxm6\" (UID: \"8ce14c23-57a1-469c-b46c-7b57907dc234\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nfxm6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194467 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/74a4246e-6a4f-4e6b-9562-a3746d802003-apiservice-cert\") pod \"packageserver-d55dfcdfc-xkst4\" (UID: 
\"74a4246e-6a4f-4e6b-9562-a3746d802003\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194485 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/834546fa-9104-4394-a674-e0350de62fb1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gx875\" (UID: \"834546fa-9104-4394-a674-e0350de62fb1\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx875" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194521 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0992c967-65e9-45d8-abe0-80fdb0f4fbcf-node-bootstrap-token\") pod \"machine-config-server-d78vr\" (UID: \"0992c967-65e9-45d8-abe0-80fdb0f4fbcf\") " pod="openshift-machine-config-operator/machine-config-server-d78vr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194538 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/43435d4b-6820-4794-a431-3946e0486b20-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rrzxd\" (UID: \"43435d4b-6820-4794-a431-3946e0486b20\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194553 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7vw9\" (UniqueName: \"kubernetes.io/projected/0992c967-65e9-45d8-abe0-80fdb0f4fbcf-kube-api-access-p7vw9\") pod \"machine-config-server-d78vr\" (UID: \"0992c967-65e9-45d8-abe0-80fdb0f4fbcf\") " pod="openshift-machine-config-operator/machine-config-server-d78vr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194568 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/e42464d4-5bc2-41b9-b28c-0dae67783433-srv-cert\") pod \"catalog-operator-68c6474976-2fqws\" (UID: \"e42464d4-5bc2-41b9-b28c-0dae67783433\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194582 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-plugins-dir\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194597 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvm8w\" (UniqueName: \"kubernetes.io/projected/d3383e66-50dd-4d8a-a24f-faa83ad90022-kube-api-access-vvm8w\") pod \"service-ca-9c57cc56f-4pbr6\" (UID: \"d3383e66-50dd-4d8a-a24f-faa83ad90022\") " pod="openshift-service-ca/service-ca-9c57cc56f-4pbr6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194612 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkzzj\" (UniqueName: \"kubernetes.io/projected/10c622e2-78c9-42bd-9031-776cded4435c-kube-api-access-nkzzj\") pod \"collect-profiles-29328255-xfjfm\" (UID: \"10c622e2-78c9-42bd-9031-776cded4435c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194628 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s44w8\" (UniqueName: \"kubernetes.io/projected/1f3b053c-4395-46db-9d64-094d422f8145-kube-api-access-s44w8\") pod \"service-ca-operator-777779d784-n7kdk\" (UID: \"1f3b053c-4395-46db-9d64-094d422f8145\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194671 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10c622e2-78c9-42bd-9031-776cded4435c-secret-volume\") pod \"collect-profiles-29328255-xfjfm\" (UID: \"10c622e2-78c9-42bd-9031-776cded4435c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194684 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b92548e-588c-43d5-a99e-9fe9558c8526-metrics-tls\") pod \"dns-default-459zr\" (UID: \"2b92548e-588c-43d5-a99e-9fe9558c8526\") " pod="openshift-dns/dns-default-459zr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194697 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/170ae05d-7931-4f6b-a1b1-d4cf1c6709ba-cert\") pod \"ingress-canary-9wgjd\" (UID: \"170ae05d-7931-4f6b-a1b1-d4cf1c6709ba\") " pod="openshift-ingress-canary/ingress-canary-9wgjd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194716 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn4mc\" (UniqueName: \"kubernetes.io/projected/170ae05d-7931-4f6b-a1b1-d4cf1c6709ba-kube-api-access-xn4mc\") pod \"ingress-canary-9wgjd\" (UID: \"170ae05d-7931-4f6b-a1b1-d4cf1c6709ba\") " pod="openshift-ingress-canary/ingress-canary-9wgjd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194733 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f9cb473-0c3f-4b38-96ab-5d6243a7aa85-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vvtsf\" (UID: \"8f9cb473-0c3f-4b38-96ab-5d6243a7aa85\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf" Oct 05 20:17:13 crc kubenswrapper[4753]: 
I1005 20:17:13.194751 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f3b053c-4395-46db-9d64-094d422f8145-serving-cert\") pod \"service-ca-operator-777779d784-n7kdk\" (UID: \"1f3b053c-4395-46db-9d64-094d422f8145\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194765 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f3b053c-4395-46db-9d64-094d422f8145-config\") pod \"service-ca-operator-777779d784-n7kdk\" (UID: \"1f3b053c-4395-46db-9d64-094d422f8145\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194788 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbw7r\" (UniqueName: \"kubernetes.io/projected/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-kube-api-access-wbw7r\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194815 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d3383e66-50dd-4d8a-a24f-faa83ad90022-signing-key\") pod \"service-ca-9c57cc56f-4pbr6\" (UID: \"d3383e66-50dd-4d8a-a24f-faa83ad90022\") " pod="openshift-service-ca/service-ca-9c57cc56f-4pbr6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194833 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/74a4246e-6a4f-4e6b-9562-a3746d802003-tmpfs\") pod \"packageserver-d55dfcdfc-xkst4\" (UID: \"74a4246e-6a4f-4e6b-9562-a3746d802003\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:13 crc 
kubenswrapper[4753]: I1005 20:17:13.194853 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d3383e66-50dd-4d8a-a24f-faa83ad90022-signing-cabundle\") pod \"service-ca-9c57cc56f-4pbr6\" (UID: \"d3383e66-50dd-4d8a-a24f-faa83ad90022\") " pod="openshift-service-ca/service-ca-9c57cc56f-4pbr6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.194872 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-registration-dir\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.195060 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-registration-dir\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: E1005 20:17:13.195125 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:13.695113262 +0000 UTC m=+142.543441494 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.207007 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-bound-sa-token\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.208516 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/74a4246e-6a4f-4e6b-9562-a3746d802003-apiservice-cert\") pod \"packageserver-d55dfcdfc-xkst4\" (UID: \"74a4246e-6a4f-4e6b-9562-a3746d802003\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.208600 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-socket-dir\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.209783 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm"] Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.211178 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/10c622e2-78c9-42bd-9031-776cded4435c-config-volume\") pod \"collect-profiles-29328255-xfjfm\" (UID: \"10c622e2-78c9-42bd-9031-776cded4435c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.211238 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/834546fa-9104-4394-a674-e0350de62fb1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gx875\" (UID: \"834546fa-9104-4394-a674-e0350de62fb1\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx875" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.211344 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-mountpoint-dir\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.211396 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-csi-data-dir\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.212290 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/74a4246e-6a4f-4e6b-9562-a3746d802003-tmpfs\") pod \"packageserver-d55dfcdfc-xkst4\" (UID: \"74a4246e-6a4f-4e6b-9562-a3746d802003\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.213003 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/d3383e66-50dd-4d8a-a24f-faa83ad90022-signing-cabundle\") pod \"service-ca-9c57cc56f-4pbr6\" (UID: \"d3383e66-50dd-4d8a-a24f-faa83ad90022\") " pod="openshift-service-ca/service-ca-9c57cc56f-4pbr6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.213048 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-plugins-dir\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.217014 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e42464d4-5bc2-41b9-b28c-0dae67783433-profile-collector-cert\") pod \"catalog-operator-68c6474976-2fqws\" (UID: \"e42464d4-5bc2-41b9-b28c-0dae67783433\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.218086 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0487fb96-92f2-462f-a454-528377db3dd4-proxy-tls\") pod \"machine-config-controller-84d6567774-8qwsm\" (UID: \"0487fb96-92f2-462f-a454-528377db3dd4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.221278 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f3b053c-4395-46db-9d64-094d422f8145-config\") pod \"service-ca-operator-777779d784-n7kdk\" (UID: \"1f3b053c-4395-46db-9d64-094d422f8145\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.222566 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b92548e-588c-43d5-a99e-9fe9558c8526-config-volume\") pod \"dns-default-459zr\" (UID: \"2b92548e-588c-43d5-a99e-9fe9558c8526\") " pod="openshift-dns/dns-default-459zr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.223614 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0992c967-65e9-45d8-abe0-80fdb0f4fbcf-node-bootstrap-token\") pod \"machine-config-server-d78vr\" (UID: \"0992c967-65e9-45d8-abe0-80fdb0f4fbcf\") " pod="openshift-machine-config-operator/machine-config-server-d78vr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.224078 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10c622e2-78c9-42bd-9031-776cded4435c-secret-volume\") pod \"collect-profiles-29328255-xfjfm\" (UID: \"10c622e2-78c9-42bd-9031-776cded4435c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.225218 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0992c967-65e9-45d8-abe0-80fdb0f4fbcf-certs\") pod \"machine-config-server-d78vr\" (UID: \"0992c967-65e9-45d8-abe0-80fdb0f4fbcf\") " pod="openshift-machine-config-operator/machine-config-server-d78vr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.228624 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/74a4246e-6a4f-4e6b-9562-a3746d802003-webhook-cert\") pod \"packageserver-d55dfcdfc-xkst4\" (UID: \"74a4246e-6a4f-4e6b-9562-a3746d802003\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.232674 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/0487fb96-92f2-462f-a454-528377db3dd4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8qwsm\" (UID: \"0487fb96-92f2-462f-a454-528377db3dd4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.232910 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f3b053c-4395-46db-9d64-094d422f8145-serving-cert\") pod \"service-ca-operator-777779d784-n7kdk\" (UID: \"1f3b053c-4395-46db-9d64-094d422f8145\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.242453 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/43435d4b-6820-4794-a431-3946e0486b20-srv-cert\") pod \"olm-operator-6b444d44fb-rrzxd\" (UID: \"43435d4b-6820-4794-a431-3946e0486b20\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.242831 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e42464d4-5bc2-41b9-b28c-0dae67783433-srv-cert\") pod \"catalog-operator-68c6474976-2fqws\" (UID: \"e42464d4-5bc2-41b9-b28c-0dae67783433\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.243310 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f9cb473-0c3f-4b38-96ab-5d6243a7aa85-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vvtsf\" (UID: \"8f9cb473-0c3f-4b38-96ab-5d6243a7aa85\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf" Oct 05 20:17:13 crc 
kubenswrapper[4753]: I1005 20:17:13.244536 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/43435d4b-6820-4794-a431-3946e0486b20-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rrzxd\" (UID: \"43435d4b-6820-4794-a431-3946e0486b20\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.249494 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcdlz\" (UniqueName: \"kubernetes.io/projected/7da48090-042e-4fef-afdf-9e6e54a89fe2-kube-api-access-qcdlz\") pod \"control-plane-machine-set-operator-78cbb6b69f-h7rzp\" (UID: \"7da48090-042e-4fef-afdf-9e6e54a89fe2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h7rzp" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.249654 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d3383e66-50dd-4d8a-a24f-faa83ad90022-signing-key\") pod \"service-ca-9c57cc56f-4pbr6\" (UID: \"d3383e66-50dd-4d8a-a24f-faa83ad90022\") " pod="openshift-service-ca/service-ca-9c57cc56f-4pbr6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.250004 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/834546fa-9104-4394-a674-e0350de62fb1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gx875\" (UID: \"834546fa-9104-4394-a674-e0350de62fb1\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx875" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.250369 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b92548e-588c-43d5-a99e-9fe9558c8526-metrics-tls\") pod \"dns-default-459zr\" (UID: \"2b92548e-588c-43d5-a99e-9fe9558c8526\") " 
pod="openshift-dns/dns-default-459zr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.250388 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8ce14c23-57a1-469c-b46c-7b57907dc234-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-nfxm6\" (UID: \"8ce14c23-57a1-469c-b46c-7b57907dc234\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nfxm6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.254110 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/170ae05d-7931-4f6b-a1b1-d4cf1c6709ba-cert\") pod \"ingress-canary-9wgjd\" (UID: \"170ae05d-7931-4f6b-a1b1-d4cf1c6709ba\") " pod="openshift-ingress-canary/ingress-canary-9wgjd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.258079 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2hd6\" (UniqueName: \"kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-kube-api-access-p2hd6\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.265880 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86e1bede-355f-425d-b6c8-b300f7addf32-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t6jr9\" (UID: \"86e1bede-355f-425d-b6c8-b300f7addf32\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.265978 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvhg2\" (UniqueName: \"kubernetes.io/projected/53527faf-a0ea-4edf-9cfa-74ae88f1b3cb-kube-api-access-gvhg2\") pod \"etcd-operator-b45778765-rwrj7\" (UID: 
\"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.282608 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6w7g\" (UniqueName: \"kubernetes.io/projected/e8855a1b-3e62-4ed3-acb0-eb0663d8df01-kube-api-access-f6w7g\") pod \"router-default-5444994796-zbdx5\" (UID: \"e8855a1b-3e62-4ed3-acb0-eb0663d8df01\") " pod="openshift-ingress/router-default-5444994796-zbdx5" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.297349 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: E1005 20:17:13.297869 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:13.797857656 +0000 UTC m=+142.646185888 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.318130 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrkfc\" (UniqueName: \"kubernetes.io/projected/74a4246e-6a4f-4e6b-9562-a3746d802003-kube-api-access-xrkfc\") pod \"packageserver-d55dfcdfc-xkst4\" (UID: \"74a4246e-6a4f-4e6b-9562-a3746d802003\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.370481 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vt2k\" (UniqueName: \"kubernetes.io/projected/834546fa-9104-4394-a674-e0350de62fb1-kube-api-access-9vt2k\") pod \"marketplace-operator-79b997595-gx875\" (UID: \"834546fa-9104-4394-a674-e0350de62fb1\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx875" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.375324 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr9z9\" (UniqueName: \"kubernetes.io/projected/e42464d4-5bc2-41b9-b28c-0dae67783433-kube-api-access-hr9z9\") pod \"catalog-operator-68c6474976-2fqws\" (UID: \"e42464d4-5bc2-41b9-b28c-0dae67783433\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.378475 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.410010 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wsrwg" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.412017 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.416954 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm"] Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.421615 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:13 crc kubenswrapper[4753]: E1005 20:17:13.422609 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:13.922591287 +0000 UTC m=+142.770919509 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.422823 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhgjl\" (UniqueName: \"kubernetes.io/projected/0487fb96-92f2-462f-a454-528377db3dd4-kube-api-access-lhgjl\") pod \"machine-config-controller-84d6567774-8qwsm\" (UID: \"0487fb96-92f2-462f-a454-528377db3dd4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.423187 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-zbdx5" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.426197 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.432437 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xjp5g"] Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.445391 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtwgc\" (UniqueName: \"kubernetes.io/projected/8ce14c23-57a1-469c-b46c-7b57907dc234-kube-api-access-rtwgc\") pod \"multus-admission-controller-857f4d67dd-nfxm6\" (UID: \"8ce14c23-57a1-469c-b46c-7b57907dc234\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nfxm6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.446580 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbw7r\" (UniqueName: \"kubernetes.io/projected/33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8-kube-api-access-wbw7r\") pod \"csi-hostpathplugin-w2sqj\" (UID: \"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8\") " pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.457901 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx"] Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.463658 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w268n\" (UniqueName: \"kubernetes.io/projected/8f9cb473-0c3f-4b38-96ab-5d6243a7aa85-kube-api-access-w268n\") pod \"package-server-manager-789f6589d5-vvtsf\" (UID: \"8f9cb473-0c3f-4b38-96ab-5d6243a7aa85\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.464046 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg66b\" (UniqueName: 
\"kubernetes.io/projected/2b92548e-588c-43d5-a99e-9fe9558c8526-kube-api-access-wg66b\") pod \"dns-default-459zr\" (UID: \"2b92548e-588c-43d5-a99e-9fe9558c8526\") " pod="openshift-dns/dns-default-459zr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.470679 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h7rzp" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.479612 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s44w8\" (UniqueName: \"kubernetes.io/projected/1f3b053c-4395-46db-9d64-094d422f8145-kube-api-access-s44w8\") pod \"service-ca-operator-777779d784-n7kdk\" (UID: \"1f3b053c-4395-46db-9d64-094d422f8145\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.481065 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-nfxm6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.493117 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.496895 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvm8w\" (UniqueName: \"kubernetes.io/projected/d3383e66-50dd-4d8a-a24f-faa83ad90022-kube-api-access-vvm8w\") pod \"service-ca-9c57cc56f-4pbr6\" (UID: \"d3383e66-50dd-4d8a-a24f-faa83ad90022\") " pod="openshift-service-ca/service-ca-9c57cc56f-4pbr6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.515790 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.516489 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.519038 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.523475 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: E1005 20:17:13.523763 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:14.023751594 +0000 UTC m=+142.872079826 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.525931 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.534128 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gx875" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.536633 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkzzj\" (UniqueName: \"kubernetes.io/projected/10c622e2-78c9-42bd-9031-776cded4435c-kube-api-access-nkzzj\") pod \"collect-profiles-29328255-xfjfm\" (UID: \"10c622e2-78c9-42bd-9031-776cded4435c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.543554 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4pbr6" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.545088 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7vw9\" (UniqueName: \"kubernetes.io/projected/0992c967-65e9-45d8-abe0-80fdb0f4fbcf-kube-api-access-p7vw9\") pod \"machine-config-server-d78vr\" (UID: \"0992c967-65e9-45d8-abe0-80fdb0f4fbcf\") " pod="openshift-machine-config-operator/machine-config-server-d78vr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.549968 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.555938 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn4mc\" (UniqueName: \"kubernetes.io/projected/170ae05d-7931-4f6b-a1b1-d4cf1c6709ba-kube-api-access-xn4mc\") pod \"ingress-canary-9wgjd\" (UID: \"170ae05d-7931-4f6b-a1b1-d4cf1c6709ba\") " pod="openshift-ingress-canary/ingress-canary-9wgjd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.560616 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9wgjd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.573669 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwh6b\" (UniqueName: \"kubernetes.io/projected/43435d4b-6820-4794-a431-3946e0486b20-kube-api-access-fwh6b\") pod \"olm-operator-6b444d44fb-rrzxd\" (UID: \"43435d4b-6820-4794-a431-3946e0486b20\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.580431 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.587575 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-d78vr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.594056 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-459zr" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.608310 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q9dxv" event={"ID":"f647e6b6-7d7f-4c72-9506-af98598583fc","Type":"ContainerStarted","Data":"3dc2e9cc26381ea1ec3462849293da6682d2c659e2d5b6f9c1db74ae975a9428"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.612783 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-66zm9" event={"ID":"06068c71-eeec-4220-8279-579f47741023","Type":"ContainerStarted","Data":"3e0f5970637b8966fdeb25794762b4d2ba39f309e1f85c9df5712555400eadda"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.612825 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-66zm9" event={"ID":"06068c71-eeec-4220-8279-579f47741023","Type":"ContainerStarted","Data":"da43d579107edc24d5d1ec57dc973d43f90eb0ceac76f6750994e2ceb4fef40e"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.612835 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-66zm9" event={"ID":"06068c71-eeec-4220-8279-579f47741023","Type":"ContainerStarted","Data":"8bda552a5966c7e59065be2d30fc415b5a260c26a698d1a5bb3ced789efe5e47"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.620066 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" event={"ID":"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2","Type":"ContainerStarted","Data":"77d0d6b1bf87cd470462bd298023aa406ffc4be5c01b9aa6c67a5b9228179b40"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.620096 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" 
event={"ID":"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2","Type":"ContainerStarted","Data":"30ff1ada779e08e24d683997af2522163b5541d6439bf424df6ef278626ec522"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.620321 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.621997 4753 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-tpd8r container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.20:6443/healthz\": dial tcp 10.217.0.20:6443: connect: connection refused" start-of-body= Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.622037 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" podUID="a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.20:6443/healthz\": dial tcp 10.217.0.20:6443: connect: connection refused" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.622402 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l" event={"ID":"91f436b8-10e7-4be0-9fd5-3047c5fafa45","Type":"ContainerStarted","Data":"9c08b863b49b1ec2041bccb52dfa5bd2a8c023e938c14889fdda46fbca433e59"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.622430 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l" event={"ID":"91f436b8-10e7-4be0-9fd5-3047c5fafa45","Type":"ContainerStarted","Data":"9779b336508ca26fd2de4e2b6906da3f7d2ab5181367b324fb66f63f218489dd"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.624535 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:13 crc kubenswrapper[4753]: E1005 20:17:13.625594 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:14.12557552 +0000 UTC m=+142.973903752 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.628670 4753 generic.go:334] "Generic (PLEG): container finished" podID="ad58fa0b-b61d-4afa-bf14-f82b2b976bdd" containerID="bc7186acd7a12170a89ed88e07a0a25116aa7bd832fc17f0a6125af1f3dd3b8e" exitCode=0 Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.628952 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" event={"ID":"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd","Type":"ContainerDied","Data":"bc7186acd7a12170a89ed88e07a0a25116aa7bd832fc17f0a6125af1f3dd3b8e"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.630306 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjp5g" event={"ID":"99413e5b-15a4-40f6-b7a5-2dbee5eafb1d","Type":"ContainerStarted","Data":"578bb3ebc58d8fa3eb8134ea15a8e7204962fd29f8440fda092c2091230df90d"} Oct 05 20:17:13 crc kubenswrapper[4753]: 
I1005 20:17:13.639555 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r" event={"ID":"a2a48e1a-ef95-45ff-89b4-6779d95a2096","Type":"ContainerStarted","Data":"45bd49a696981c1239c5c2e2bae170ce57006702fb3da1c6cd43335f178643bb"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.644331 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" event={"ID":"9a13243e-13d5-4a22-9417-8dfc3896f332","Type":"ContainerStarted","Data":"529e596313bfbabd8afb8bdcf4e7f80368dc5bf499349a0f8e2abaef20f6dd24"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.655267 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7klvp" event={"ID":"eb329af5-99e8-42d9-b79e-4c9acd09204d","Type":"ContainerStarted","Data":"86a62bfa2597b4fc213a91e7905006bd9f7f4d3472c15cdd88f2a08ef9d9fc7e"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.658797 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm" event={"ID":"ef7125ff-9b89-4972-954b-61145623ecec","Type":"ContainerStarted","Data":"06cec6b06133c3bd617fbdf90628a1ab8c7d41d4cdbe9def64a96afde2213e6c"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.662839 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" event={"ID":"6e73e9c7-3af4-4b10-a331-7899608702b3","Type":"ContainerStarted","Data":"a094bddebbfc33e5c4739e6518c6ca2f92f9f90cd83c380bb7e7d18f17095679"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.662866 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" 
event={"ID":"6e73e9c7-3af4-4b10-a331-7899608702b3","Type":"ContainerStarted","Data":"aecff357a6639c01f4249264a28698e8b9f24e81b4039c03ae46a9d5aa221f95"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.663526 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.670188 4753 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-x9jfj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.670230 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" podUID="6e73e9c7-3af4-4b10-a331-7899608702b3" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.691000 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr" event={"ID":"f1b80304-1bf0-464a-b196-392d4c4a6e6b","Type":"ContainerStarted","Data":"8fffcecd9fd9dadaf17293b9e568bf5aa913c2db387be045b11838e59827cb3e"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.691066 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr" event={"ID":"f1b80304-1bf0-464a-b196-392d4c4a6e6b","Type":"ContainerStarted","Data":"e40d0948b373d6430cba8b336859ba574bdb64b5c754348e9e1b02a2c8df25ae"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.696935 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bnpxs" 
event={"ID":"0cae56c9-6ca6-49f3-97d2-14e8a6748315","Type":"ContainerStarted","Data":"20731fd87a9a5563e42e9e4e9275c5450c31239e7f8f32afe81e2ff6aaa6ac97"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.696990 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bnpxs" event={"ID":"0cae56c9-6ca6-49f3-97d2-14e8a6748315","Type":"ContainerStarted","Data":"35de633c664a498e56527909fb32593d6c5fb0c992f589348783d50d9fa93ddd"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.698706 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-bnpxs" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.704762 4753 patch_prober.go:28] interesting pod/downloads-7954f5f757-bnpxs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.704811 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bnpxs" podUID="0cae56c9-6ca6-49f3-97d2-14e8a6748315" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.716358 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" event={"ID":"5615672e-59bf-448b-88ba-75a02438a8ad","Type":"ContainerStarted","Data":"6edf788c0d9a70295f8b4c577424f1378038a37a0d1d9a8def15784b8bc61bdc"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.716411 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" 
event={"ID":"5615672e-59bf-448b-88ba-75a02438a8ad","Type":"ContainerStarted","Data":"4af13d8f75de63241690b46c2ae5b7f0ae37477f1a6d0121e44b05057a1bc417"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.717424 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.726574 4753 generic.go:334] "Generic (PLEG): container finished" podID="a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5" containerID="cfc861c9de152d11a2dbd634beeb73247884ac42f56e7ee6a9dd531fe7461efc" exitCode=0 Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.726846 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.727168 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" event={"ID":"a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5","Type":"ContainerDied","Data":"cfc861c9de152d11a2dbd634beeb73247884ac42f56e7ee6a9dd531fe7461efc"} Oct 05 20:17:13 crc kubenswrapper[4753]: E1005 20:17:13.728234 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:14.228216981 +0000 UTC m=+143.076545213 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.731319 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw" event={"ID":"cd241b3b-a68f-487b-bcd8-ca61782d9e4f","Type":"ContainerStarted","Data":"5f0aa1c209afae562befd189076cfaeb1e226cddc8977091752a32287bf7924d"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.742824 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f" event={"ID":"22db54ee-7d52-475e-a824-9e563b2920e8","Type":"ContainerStarted","Data":"b7b8778504487b936cb3787632fb42d57b1da66f45de376b0bf164093e7b1912"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.776444 4753 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-jcc7v container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.777033 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" podUID="5615672e-59bf-448b-88ba-75a02438a8ad" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" Oct 05 20:17:13 crc kubenswrapper[4753]: 
I1005 20:17:13.790707 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm" event={"ID":"b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5","Type":"ContainerStarted","Data":"a7d2ffac19fc15fcf2d71b55ea0c2d8d76cd26e89f413326617133089eec0b7e"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.797495 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.801434 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh" event={"ID":"6f65814d-e7b3-425a-b7a5-86c7e062ad35","Type":"ContainerStarted","Data":"6ce40df0a89d9024fd4253b278a732a31efe007dd34315db43dbf68b40c5ab26"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.801482 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh" event={"ID":"6f65814d-e7b3-425a-b7a5-86c7e062ad35","Type":"ContainerStarted","Data":"7a65ba5030aa0155b021f4c96c390eade5af16ccf684d03be84ac6e78cce35ed"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.830115 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:13 crc kubenswrapper[4753]: E1005 20:17:13.832528 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-05 20:17:14.332503173 +0000 UTC m=+143.180831405 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.839467 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk"] Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.874347 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" event={"ID":"3a0f0082-3cf0-4339-9964-02916c137f45","Type":"ContainerStarted","Data":"661bdb63f3e0b1e3508460e5548900df3a3214cafd8ed0dd766d1c154e2b4c38"} Oct 05 20:17:13 crc kubenswrapper[4753]: I1005 20:17:13.932690 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:13 crc kubenswrapper[4753]: E1005 20:17:13.934024 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:14.4340133 +0000 UTC m=+143.282341532 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:13 crc kubenswrapper[4753]: W1005 20:17:13.963117 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8855a1b_3e62_4ed3_acb0_eb0663d8df01.slice/crio-c174d37dfc3e049a97d561c59c72773b502131258495893ef050a1ba65c1826a WatchSource:0}: Error finding container c174d37dfc3e049a97d561c59c72773b502131258495893ef050a1ba65c1826a: Status 404 returned error can't find the container with id c174d37dfc3e049a97d561c59c72773b502131258495893ef050a1ba65c1826a Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.036673 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:14 crc kubenswrapper[4753]: E1005 20:17:14.037406 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:14.537391004 +0000 UTC m=+143.385719236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.037456 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:14 crc kubenswrapper[4753]: E1005 20:17:14.037797 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:14.537785966 +0000 UTC m=+143.386114198 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:14 crc kubenswrapper[4753]: W1005 20:17:14.092554 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b5d8bc9_c226_4476_a309_fb21a5f79af3.slice/crio-3b9b62c5364381fa3fb4e590cc34ff7d73d050b6405d94109f7eef2873625c54 WatchSource:0}: Error finding container 3b9b62c5364381fa3fb4e590cc34ff7d73d050b6405d94109f7eef2873625c54: Status 404 returned error can't find the container with id 3b9b62c5364381fa3fb4e590cc34ff7d73d050b6405d94109f7eef2873625c54 Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.115120 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852"] Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.139722 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:14 crc kubenswrapper[4753]: E1005 20:17:14.140344 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:14.640131678 +0000 UTC m=+143.488459910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.161248 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-gnr4l" podStartSLOduration=121.161215107 podStartE2EDuration="2m1.161215107s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:14.124953967 +0000 UTC m=+142.973282199" watchObservedRunningTime="2025-10-05 20:17:14.161215107 +0000 UTC m=+143.009543339" Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.196579 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rwrj7"] Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.242325 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:14 crc kubenswrapper[4753]: E1005 20:17:14.242764 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-05 20:17:14.742746048 +0000 UTC m=+143.591074360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.242862 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-7klvp" podStartSLOduration=121.242827521 podStartE2EDuration="2m1.242827521s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:14.206906022 +0000 UTC m=+143.055234244" watchObservedRunningTime="2025-10-05 20:17:14.242827521 +0000 UTC m=+143.091155753" Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.264611 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wsrwg"] Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.344992 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:14 crc kubenswrapper[4753]: E1005 20:17:14.345307 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-10-05 20:17:14.845293897 +0000 UTC m=+143.693622129 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:14 crc kubenswrapper[4753]: W1005 20:17:14.362930 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53527faf_a0ea_4edf_9cfa_74ae88f1b3cb.slice/crio-dcf9309781a64cfe4a9711419382ef199d1614f19e4dbe570964839210f031b8 WatchSource:0}: Error finding container dcf9309781a64cfe4a9711419382ef199d1614f19e4dbe570964839210f031b8: Status 404 returned error can't find the container with id dcf9309781a64cfe4a9711419382ef199d1614f19e4dbe570964839210f031b8 Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.443961 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9"] Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.446367 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:14 crc kubenswrapper[4753]: E1005 20:17:14.446681 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:14.94667003 +0000 UTC m=+143.794998262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.498235 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" podStartSLOduration=121.498220742 podStartE2EDuration="2m1.498220742s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:14.49519642 +0000 UTC m=+143.343524652" watchObservedRunningTime="2025-10-05 20:17:14.498220742 +0000 UTC m=+143.346548974" Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.548646 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:14 crc kubenswrapper[4753]: E1005 20:17:14.549029 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:15.049015312 +0000 UTC m=+143.897343544 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.573682 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" podStartSLOduration=120.573665129 podStartE2EDuration="2m0.573665129s" podCreationTimestamp="2025-10-05 20:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:14.534065699 +0000 UTC m=+143.382393941" watchObservedRunningTime="2025-10-05 20:17:14.573665129 +0000 UTC m=+143.421993361" Oct 05 20:17:14 crc kubenswrapper[4753]: W1005 20:17:14.594816 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86e1bede_355f_425d_b6c8_b300f7addf32.slice/crio-71975b15d288ffab2d64bb2c82bda49bf09e1a8341fb9e523f55b322d94cf322 WatchSource:0}: Error finding container 71975b15d288ffab2d64bb2c82bda49bf09e1a8341fb9e523f55b322d94cf322: Status 404 returned error can't find the container with id 71975b15d288ffab2d64bb2c82bda49bf09e1a8341fb9e523f55b322d94cf322 Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.594835 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" podStartSLOduration=121.594821461 podStartE2EDuration="2m1.594821461s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:14.590320154 +0000 UTC m=+143.438648386" watchObservedRunningTime="2025-10-05 20:17:14.594821461 +0000 UTC m=+143.443149693" Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.610607 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-nfxm6"] Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.679390 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:14 crc kubenswrapper[4753]: E1005 20:17:14.679912 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:15.17990183 +0000 UTC m=+144.028230062 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.785825 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:14 crc kubenswrapper[4753]: E1005 20:17:14.786334 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:15.286305315 +0000 UTC m=+144.134633547 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.825654 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w2sqj"] Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.892396 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:14 crc kubenswrapper[4753]: E1005 20:17:14.892700 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:15.39268827 +0000 UTC m=+144.241016502 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.964529 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-d8b6f" podStartSLOduration=120.964507917 podStartE2EDuration="2m0.964507917s" podCreationTimestamp="2025-10-05 20:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:14.964019982 +0000 UTC m=+143.812348214" watchObservedRunningTime="2025-10-05 20:17:14.964507917 +0000 UTC m=+143.812836149" Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.995763 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:14 crc kubenswrapper[4753]: E1005 20:17:14.996053 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:15.496037293 +0000 UTC m=+144.344365525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:14 crc kubenswrapper[4753]: I1005 20:17:14.996262 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:14 crc kubenswrapper[4753]: E1005 20:17:14.996890 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:15.496874798 +0000 UTC m=+144.345203030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.027669 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-bnpxs" podStartSLOduration=122.027651791 podStartE2EDuration="2m2.027651791s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:14.995536378 +0000 UTC m=+143.843864610" watchObservedRunningTime="2025-10-05 20:17:15.027651791 +0000 UTC m=+143.875980023" Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.046410 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-nfxm6" event={"ID":"8ce14c23-57a1-469c-b46c-7b57907dc234","Type":"ContainerStarted","Data":"22093ca63df599771b610874eaf68e5267dcd3e8633467c6130ae078908148c9"} Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.053705 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-9w7kh" podStartSLOduration=122.053684831 podStartE2EDuration="2m2.053684831s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:15.047009768 +0000 UTC m=+143.895337990" watchObservedRunningTime="2025-10-05 20:17:15.053684831 +0000 UTC m=+143.902013063" Oct 05 20:17:15 crc 
kubenswrapper[4753]: I1005 20:17:15.080527 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk" event={"ID":"5b5d8bc9-c226-4476-a309-fb21a5f79af3","Type":"ContainerStarted","Data":"3b9b62c5364381fa3fb4e590cc34ff7d73d050b6405d94109f7eef2873625c54"} Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.100860 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:15 crc kubenswrapper[4753]: E1005 20:17:15.101224 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:15.60120502 +0000 UTC m=+144.449533252 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.108568 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" event={"ID":"322f0415-b81e-4f7a-adec-e49a0460844b","Type":"ContainerStarted","Data":"71dbae794b0017212078638d7f9b2775fbe38c25b74c7690c0fa27f6d5fb4eb7"} Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.109574 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4"] Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.117311 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-66zm9" podStartSLOduration=122.117295299 podStartE2EDuration="2m2.117295299s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:15.113533585 +0000 UTC m=+143.961861817" watchObservedRunningTime="2025-10-05 20:17:15.117295299 +0000 UTC m=+143.965623541" Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.169362 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm" event={"ID":"b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5","Type":"ContainerStarted","Data":"41a03c3164dc194273040b10af3829ebce8dbbb844c831bdfe4e31c0b00db9b4"} Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.190217 4753 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wsrwg" event={"ID":"a19f7732-a06f-4b93-b549-4c0a4ccd23fa","Type":"ContainerStarted","Data":"61b570ea88e980c2beb4104b3a0d902a3d341b4b3eeab14544d990acc1ec4ff9"} Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.203343 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:15 crc kubenswrapper[4753]: E1005 20:17:15.204748 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:15.704736709 +0000 UTC m=+144.553064941 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.271750 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.277767 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" event={"ID":"3a0f0082-3cf0-4339-9964-02916c137f45","Type":"ContainerStarted","Data":"b5609a395f431671306f1a0248d8ea81fba7366f00228f091ef6d16f64a62ef2"} Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.304185 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:15 crc kubenswrapper[4753]: E1005 20:17:15.304344 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:15.804305607 +0000 UTC m=+144.652633849 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.304375 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:15 crc kubenswrapper[4753]: E1005 20:17:15.304729 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:15.804714659 +0000 UTC m=+144.653042891 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.304888 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" event={"ID":"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb","Type":"ContainerStarted","Data":"dcf9309781a64cfe4a9711419382ef199d1614f19e4dbe570964839210f031b8"} Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.331310 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" event={"ID":"9a13243e-13d5-4a22-9417-8dfc3896f332","Type":"ContainerStarted","Data":"0b8534d1987954a6564c26d5d385e0a42dd2488c91baa8012df272c868d39ae9"} Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.377722 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gx875"] Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.381308 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zbdx5" event={"ID":"e8855a1b-3e62-4ed3-acb0-eb0663d8df01","Type":"ContainerStarted","Data":"c174d37dfc3e049a97d561c59c72773b502131258495893ef050a1ba65c1826a"} Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.392707 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws"] Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.408906 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:15 crc kubenswrapper[4753]: E1005 20:17:15.409409 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:15.909391153 +0000 UTC m=+144.757719385 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.426278 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-zbdx5" Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.426487 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.426526 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 05 
20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.447218 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h278r" podStartSLOduration=122.447200999 podStartE2EDuration="2m2.447200999s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:15.409792215 +0000 UTC m=+144.258120457" watchObservedRunningTime="2025-10-05 20:17:15.447200999 +0000 UTC m=+144.295529231" Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.452225 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-d78vr" event={"ID":"0992c967-65e9-45d8-abe0-80fdb0f4fbcf","Type":"ContainerStarted","Data":"b8448f3a661a1e5d45ae30b36902e75ca3a72f1a65d2d52d73afe9ffd8cba179"} Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.477915 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf"] Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.486356 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h7rzp"] Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.513953 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:15 crc kubenswrapper[4753]: E1005 20:17:15.515259 4753 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:16.015248622 +0000 UTC m=+144.863576854 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.572367 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8h7qr" podStartSLOduration=122.572345162 podStartE2EDuration="2m2.572345162s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:15.510896799 +0000 UTC m=+144.359225041" watchObservedRunningTime="2025-10-05 20:17:15.572345162 +0000 UTC m=+144.420673394" Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.592451 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm"] Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.592508 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk"] Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.613104 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-zbdx5" podStartSLOduration=122.613086147 podStartE2EDuration="2m2.613086147s" podCreationTimestamp="2025-10-05 20:15:13 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:15.609771647 +0000 UTC m=+144.458099879" watchObservedRunningTime="2025-10-05 20:17:15.613086147 +0000 UTC m=+144.461414379" Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.613384 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm"] Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.615694 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:15 crc kubenswrapper[4753]: E1005 20:17:15.616036 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:16.116016506 +0000 UTC m=+144.964344728 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.668877 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm" podStartSLOduration=122.668859728 podStartE2EDuration="2m2.668859728s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:15.666445075 +0000 UTC m=+144.514773307" watchObservedRunningTime="2025-10-05 20:17:15.668859728 +0000 UTC m=+144.517187960" Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.670910 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9wgjd"] Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.675020 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw" event={"ID":"cd241b3b-a68f-487b-bcd8-ca61782d9e4f","Type":"ContainerStarted","Data":"7a37ca1c14ed0d2fc0cf1d532596cdcafa993158fb7b76e3c0531881c17d60ac"} Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.733806 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:15 crc kubenswrapper[4753]: E1005 20:17:15.734831 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:16.234809637 +0000 UTC m=+145.083137869 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.764159 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zpdnw" podStartSLOduration=122.764122476 podStartE2EDuration="2m2.764122476s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:15.740618443 +0000 UTC m=+144.588946675" watchObservedRunningTime="2025-10-05 20:17:15.764122476 +0000 UTC m=+144.612450708" Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.765759 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd"] Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.792077 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9" 
event={"ID":"86e1bede-355f-425d-b6c8-b300f7addf32","Type":"ContainerStarted","Data":"71975b15d288ffab2d64bb2c82bda49bf09e1a8341fb9e523f55b322d94cf322"} Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.827067 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-459zr"] Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.833019 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4pbr6"] Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.839625 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:15 crc kubenswrapper[4753]: E1005 20:17:15.840082 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:16.340067698 +0000 UTC m=+145.188395930 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.845864 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjp5g" event={"ID":"99413e5b-15a4-40f6-b7a5-2dbee5eafb1d","Type":"ContainerStarted","Data":"3b88c3eacf71a8f5f51a14845f00a0cb80ae242dfa74ebc2bbaa909590d8f2ac"} Oct 05 20:17:15 crc kubenswrapper[4753]: W1005 20:17:15.848442 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c622e2_78c9_42bd_9031_776cded4435c.slice/crio-a6d1f167cb92b234b105f681f9be8b86ec592408a577cdb7b1d4dca4f30eaa18 WatchSource:0}: Error finding container a6d1f167cb92b234b105f681f9be8b86ec592408a577cdb7b1d4dca4f30eaa18: Status 404 returned error can't find the container with id a6d1f167cb92b234b105f681f9be8b86ec592408a577cdb7b1d4dca4f30eaa18 Oct 05 20:17:15 crc kubenswrapper[4753]: W1005 20:17:15.854454 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7da48090_042e_4fef_afdf_9e6e54a89fe2.slice/crio-d3c727280732f3616f9ccc724266390a25df9d8239936a09a9ab955d756c4a4d WatchSource:0}: Error finding container d3c727280732f3616f9ccc724266390a25df9d8239936a09a9ab955d756c4a4d: Status 404 returned error can't find the container with id d3c727280732f3616f9ccc724266390a25df9d8239936a09a9ab955d756c4a4d Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.869285 4753 patch_prober.go:28] interesting pod/downloads-7954f5f757-bnpxs 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.869329 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bnpxs" podUID="0cae56c9-6ca6-49f3-97d2-14e8a6748315" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.938174 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q9dxv" event={"ID":"f647e6b6-7d7f-4c72-9506-af98598583fc","Type":"ContainerStarted","Data":"0d4a3e76a0cf4ac2aad0cf20db883419b94b92d298d62fb0d360ea1acd51fa4c"} Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.941029 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.941448 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.941482 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:17:15 crc kubenswrapper[4753]: E1005 20:17:15.947843 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-05 20:17:16.447829415 +0000 UTC m=+145.296157647 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:15 crc kubenswrapper[4753]: I1005 20:17:15.997416 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.045210 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:16 crc kubenswrapper[4753]: E1005 20:17:16.045646 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:16.545632749 +0000 UTC m=+145.393960981 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.159379 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:16 crc kubenswrapper[4753]: E1005 20:17:16.159700 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:16.659687387 +0000 UTC m=+145.508015619 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.265121 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:16 crc kubenswrapper[4753]: E1005 20:17:16.265784 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:16.765771162 +0000 UTC m=+145.614099384 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.368129 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:16 crc kubenswrapper[4753]: E1005 20:17:16.368427 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:16.868414153 +0000 UTC m=+145.716742385 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.431341 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:16 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:16 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:16 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.431388 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.475894 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:16 crc kubenswrapper[4753]: E1005 20:17:16.476012 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-05 20:17:16.975991865 +0000 UTC m=+145.824320097 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.476279 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:16 crc kubenswrapper[4753]: E1005 20:17:16.476629 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:16.976619183 +0000 UTC m=+145.824947415 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.578078 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:16 crc kubenswrapper[4753]: E1005 20:17:16.578400 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:17.078384838 +0000 UTC m=+145.926713070 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.679120 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:16 crc kubenswrapper[4753]: E1005 20:17:16.679455 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:17.179443012 +0000 UTC m=+146.027771244 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.781799 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:16 crc kubenswrapper[4753]: E1005 20:17:16.782147 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:17.282122375 +0000 UTC m=+146.130450607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.885116 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:16 crc kubenswrapper[4753]: E1005 20:17:16.885612 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:17.38559694 +0000 UTC m=+146.233925172 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.924798 4753 generic.go:334] "Generic (PLEG): container finished" podID="322f0415-b81e-4f7a-adec-e49a0460844b" containerID="bcbcd8cdc315fd58e1c55c1be84c291c6636809dd351bc1a47f50ba902020a53" exitCode=0 Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.924890 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" event={"ID":"322f0415-b81e-4f7a-adec-e49a0460844b","Type":"ContainerDied","Data":"bcbcd8cdc315fd58e1c55c1be84c291c6636809dd351bc1a47f50ba902020a53"} Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.931480 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf" event={"ID":"8f9cb473-0c3f-4b38-96ab-5d6243a7aa85","Type":"ContainerStarted","Data":"a7510de64e303f96f21def988bdfd625bb78f49f0c09a3083554655a75b4b650"} Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.961487 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" event={"ID":"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd","Type":"ContainerStarted","Data":"8c20ac4884b761adb08e5de6ba374207529d54597ce32e7f70f9dd872d2e7694"} Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.983167 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9wgjd" 
event={"ID":"170ae05d-7931-4f6b-a1b1-d4cf1c6709ba","Type":"ContainerStarted","Data":"4465373baedbca0de11e00322bd8e03fb724c5f33a21e4783f61c24a963503b7"} Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.986760 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:16 crc kubenswrapper[4753]: E1005 20:17:16.986961 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:17.486935033 +0000 UTC m=+146.335263265 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:16 crc kubenswrapper[4753]: I1005 20:17:16.987076 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:16 crc kubenswrapper[4753]: E1005 20:17:16.987802 4753 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:17.487793989 +0000 UTC m=+146.336122221 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.015636 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wsrwg" event={"ID":"a19f7732-a06f-4b93-b549-4c0a4ccd23fa","Type":"ContainerStarted","Data":"8cac0be9ec49655ac37158720e2258aeb156d25ccbe7441968830034df63988e"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.016186 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wsrwg" Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.024021 4753 patch_prober.go:28] interesting pod/console-operator-58897d9998-wsrwg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.024102 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wsrwg" podUID="a19f7732-a06f-4b93-b549-4c0a4ccd23fa" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Oct 05 
20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.045150 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-459zr" event={"ID":"2b92548e-588c-43d5-a99e-9fe9558c8526","Type":"ContainerStarted","Data":"bd99ef0bbfdaa3a979b8ef9845db0d4042cfe71a3dc42127666cb18a3e7def8e"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.050992 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-wsrwg" podStartSLOduration=124.050974184 podStartE2EDuration="2m4.050974184s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:17.043631581 +0000 UTC m=+145.891959803" watchObservedRunningTime="2025-10-05 20:17:17.050974184 +0000 UTC m=+145.899302416" Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.080024 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm" event={"ID":"ef7125ff-9b89-4972-954b-61145623ecec","Type":"ContainerStarted","Data":"7a16e1bc1767185b1dcbe03e80298a3c921c9f2de000b526841c9ea6fd677a61"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.100717 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:17 crc kubenswrapper[4753]: E1005 20:17:17.101008 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-05 20:17:17.60098509 +0000 UTC m=+146.449313322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.101214 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:17 crc kubenswrapper[4753]: E1005 20:17:17.102214 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:17.602202277 +0000 UTC m=+146.450530509 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.106840 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cwjwm" podStartSLOduration=124.106825807 podStartE2EDuration="2m4.106825807s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:17.106540378 +0000 UTC m=+145.954868610" watchObservedRunningTime="2025-10-05 20:17:17.106825807 +0000 UTC m=+145.955154039" Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.138308 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" event={"ID":"53527faf-a0ea-4edf-9cfa-74ae88f1b3cb","Type":"ContainerStarted","Data":"7c74bd036fa778d698bc5d9197f907fd9b9c752b1c7b4e54b3047551f59a6be1"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.139983 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4pbr6" event={"ID":"d3383e66-50dd-4d8a-a24f-faa83ad90022","Type":"ContainerStarted","Data":"f864cc98de42bcbeb538227137c1a3e3a92a4901ae79bea2e62ee3a757737208"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.157902 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-d78vr" 
event={"ID":"0992c967-65e9-45d8-abe0-80fdb0f4fbcf","Type":"ContainerStarted","Data":"73ceca1925421355b567396139f4d0a56c51e551389889e4de1ab2418c05ab2d"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.179219 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gx875" event={"ID":"834546fa-9104-4394-a674-e0350de62fb1","Type":"ContainerStarted","Data":"f9ed716ca58adfc1bd9841ddb4c9df3099480492ffee6e7c686b365710e93e94"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.187079 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zbdx5" event={"ID":"e8855a1b-3e62-4ed3-acb0-eb0663d8df01","Type":"ContainerStarted","Data":"4ae3e0ae38bc00a87ff240ac3a5fa3a5e3abc5b5fdeb9adea67275b068ade001"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.188275 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-rwrj7" podStartSLOduration=124.188263185 podStartE2EDuration="2m4.188263185s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:17.186664387 +0000 UTC m=+146.034992619" watchObservedRunningTime="2025-10-05 20:17:17.188263185 +0000 UTC m=+146.036591417" Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.202733 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:17 crc kubenswrapper[4753]: E1005 20:17:17.203989 4753 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:17.703972722 +0000 UTC m=+146.552300954 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.228387 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9" event={"ID":"86e1bede-355f-425d-b6c8-b300f7addf32","Type":"ContainerStarted","Data":"768f220c4f2ca2411a0437058a353082fe2b02adeddaa4ee2adbdf943a071315"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.244111 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-d78vr" podStartSLOduration=7.244095969 podStartE2EDuration="7.244095969s" podCreationTimestamp="2025-10-05 20:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:17.242993925 +0000 UTC m=+146.091322157" watchObservedRunningTime="2025-10-05 20:17:17.244095969 +0000 UTC m=+146.092424201" Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.257092 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk" 
event={"ID":"1f3b053c-4395-46db-9d64-094d422f8145","Type":"ContainerStarted","Data":"a1acae6c7b3aeddf3ff985104bb13bd198ecdd888a69e49b55b9b3d4600fc0a9"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.285307 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6jr9" podStartSLOduration=124.285294937 podStartE2EDuration="2m4.285294937s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:17.2843882 +0000 UTC m=+146.132716432" watchObservedRunningTime="2025-10-05 20:17:17.285294937 +0000 UTC m=+146.133623169" Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.289518 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" event={"ID":"43435d4b-6820-4794-a431-3946e0486b20","Type":"ContainerStarted","Data":"8d8a54470272d33acd3ade5656c461cc164582b30a4e7b435951cac6a17c97a9"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.309966 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk" event={"ID":"5b5d8bc9-c226-4476-a309-fb21a5f79af3","Type":"ContainerStarted","Data":"da99bbd339628642afcc05d27c93ec69a733ee4ae125833c2d65758600ca2aaf"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.311127 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:17 crc kubenswrapper[4753]: E1005 20:17:17.312169 
4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:17.812158451 +0000 UTC m=+146.660486683 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.333999 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h7rzp" event={"ID":"7da48090-042e-4fef-afdf-9e6e54a89fe2","Type":"ContainerStarted","Data":"d3c727280732f3616f9ccc724266390a25df9d8239936a09a9ab955d756c4a4d"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.356243 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lkdpm" event={"ID":"b4f7a5a6-df08-4cdb-8d5f-8d62721ab8b5","Type":"ContainerStarted","Data":"4bfa132095a1a7ccbea0183cd27fb09244f328917e9391a14293603489b6c199"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.368819 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" event={"ID":"3a0f0082-3cf0-4339-9964-02916c137f45","Type":"ContainerStarted","Data":"066dadcfd6c67eca1c01daf556b7773f5f10e66ad5f63e5762271467181d50bf"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.371102 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" 
event={"ID":"e42464d4-5bc2-41b9-b28c-0dae67783433","Type":"ContainerStarted","Data":"a89f652b9b3b803b60f3aab2cd3e179218f2499b604577933ee9e42ad3e303a8"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.373949 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.374993 4753 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-2fqws container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.375023 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" podUID="e42464d4-5bc2-41b9-b28c-0dae67783433" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.391611 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" event={"ID":"74a4246e-6a4f-4e6b-9562-a3746d802003","Type":"ContainerStarted","Data":"abeb885139e2867c9c092197e0859a6c7764c04e3f4a21435bf5b9d520dcb458"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.393563 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" event={"ID":"a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5","Type":"ContainerStarted","Data":"790ae0938bdd12ded5fe1ad2bcf9179fb33bb8ecec86d675e333b06a538bfd20"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.394051 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.412799 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9h4dc" podStartSLOduration=124.412785891 podStartE2EDuration="2m4.412785891s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:17.410346578 +0000 UTC m=+146.258674810" watchObservedRunningTime="2025-10-05 20:17:17.412785891 +0000 UTC m=+146.261114123" Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.412880 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59prk" podStartSLOduration=124.412876324 podStartE2EDuration="2m4.412876324s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:17.345793061 +0000 UTC m=+146.194121293" watchObservedRunningTime="2025-10-05 20:17:17.412876324 +0000 UTC m=+146.261204556" Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.413659 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:17 crc kubenswrapper[4753]: E1005 20:17:17.414801 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-05 20:17:17.914785132 +0000 UTC m=+146.763113364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.431305 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:17 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:17 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:17 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.431365 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.444790 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" event={"ID":"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8","Type":"ContainerStarted","Data":"c146e1754f54e0666d91bb774225ce944fe50fd85fc34921b29031b6141bc150"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.453179 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm" 
event={"ID":"0487fb96-92f2-462f-a454-528377db3dd4","Type":"ContainerStarted","Data":"6ad95091d722062f5885291a132c12fd93ff11dcaa70f440ceb05faba121d48f"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.453411 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm" event={"ID":"0487fb96-92f2-462f-a454-528377db3dd4","Type":"ContainerStarted","Data":"39e6348ba82af5111bc699b18534c235ff6605ab5dcbe3c536dc91893bc6590d"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.454586 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" event={"ID":"10c622e2-78c9-42bd-9031-776cded4435c","Type":"ContainerStarted","Data":"a6d1f167cb92b234b105f681f9be8b86ec592408a577cdb7b1d4dca4f30eaa18"} Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.455370 4753 patch_prober.go:28] interesting pod/downloads-7954f5f757-bnpxs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.455414 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bnpxs" podUID="0cae56c9-6ca6-49f3-97d2-14e8a6748315" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.464321 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" podStartSLOduration=123.464308594 podStartE2EDuration="2m3.464308594s" podCreationTimestamp="2025-10-05 20:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-05 20:17:17.463398216 +0000 UTC m=+146.311726448" watchObservedRunningTime="2025-10-05 20:17:17.464308594 +0000 UTC m=+146.312636826" Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.514782 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:17 crc kubenswrapper[4753]: E1005 20:17:17.518062 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:18.018050423 +0000 UTC m=+146.866378655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.577811 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" podStartSLOduration=124.577785393 podStartE2EDuration="2m4.577785393s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:17.534318876 +0000 UTC m=+146.382647108" watchObservedRunningTime="2025-10-05 
20:17:17.577785393 +0000 UTC m=+146.426113625" Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.617591 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:17 crc kubenswrapper[4753]: E1005 20:17:17.618005 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:18.117987122 +0000 UTC m=+146.966315354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.720448 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:17 crc kubenswrapper[4753]: E1005 20:17:17.720961 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-05 20:17:18.220944903 +0000 UTC m=+147.069273135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.821398 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:17 crc kubenswrapper[4753]: E1005 20:17:17.821689 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:18.321647136 +0000 UTC m=+147.169975368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.821942 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:17 crc kubenswrapper[4753]: E1005 20:17:17.822419 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:18.322393838 +0000 UTC m=+147.170722070 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.923246 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:17 crc kubenswrapper[4753]: E1005 20:17:17.923461 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:18.4234166 +0000 UTC m=+147.271744822 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:17 crc kubenswrapper[4753]: I1005 20:17:17.923583 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:17 crc kubenswrapper[4753]: E1005 20:17:17.924233 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:18.424224255 +0000 UTC m=+147.272552487 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.025376 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.026470 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:18.526436154 +0000 UTC m=+147.374764376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.127253 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.127634 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:18.62762316 +0000 UTC m=+147.475951382 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.229403 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.229594 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:18.72956213 +0000 UTC m=+147.577890362 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.230023 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.230442 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:18.730431067 +0000 UTC m=+147.578759299 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.331702 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.331911 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:18.831872812 +0000 UTC m=+147.680201044 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.332240 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.332695 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:18.832686827 +0000 UTC m=+147.681015059 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.431095 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:18 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:18 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:18 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.431192 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.434574 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.435058 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-05 20:17:18.93503481 +0000 UTC m=+147.783363042 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.461921 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" event={"ID":"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8","Type":"ContainerStarted","Data":"c5c24b3d73e3350769a1d3eef12c5f5690d423d1890f51d97720650677f0e576"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.463292 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" event={"ID":"e42464d4-5bc2-41b9-b28c-0dae67783433","Type":"ContainerStarted","Data":"0d615cb5f0f8e5dc331f5aef23ef7cfb971f8a9f08df6ce2e2c49c3de1980d43"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.464296 4753 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-2fqws container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.464358 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" podUID="e42464d4-5bc2-41b9-b28c-0dae67783433" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" 
Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.464999 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4pbr6" event={"ID":"d3383e66-50dd-4d8a-a24f-faa83ad90022","Type":"ContainerStarted","Data":"c835a6e0082de28f17085afc6d625b39c7d9f6f539347c057657387c4d9d8831"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.466751 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" event={"ID":"10c622e2-78c9-42bd-9031-776cded4435c","Type":"ContainerStarted","Data":"a12c5e14e454f061e96cbde4ee73a98879889ea378e7bd61c2821f3a9c3f6b8f"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.468595 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-nfxm6" event={"ID":"8ce14c23-57a1-469c-b46c-7b57907dc234","Type":"ContainerStarted","Data":"d1a14130145de03f13da7ca9c0963c8aa1fbf765135a8b6156254679b1b2a1e3"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.468646 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-nfxm6" event={"ID":"8ce14c23-57a1-469c-b46c-7b57907dc234","Type":"ContainerStarted","Data":"ec14f98f75cdfd78f7e9196d472b041fdbdd4f8060c987f52165bc8f3654fa43"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.470960 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk" event={"ID":"1f3b053c-4395-46db-9d64-094d422f8145","Type":"ContainerStarted","Data":"00e998b258a5bbde104520178ab8917ecb1859aaa1361a3d0b7d33bc28ce2e34"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.478880 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf" 
event={"ID":"8f9cb473-0c3f-4b38-96ab-5d6243a7aa85","Type":"ContainerStarted","Data":"ec2dd0990d5f0df7ac5090ab61d9273ce529d276efe877dcbf7e012f16dd5f87"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.478952 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf" event={"ID":"8f9cb473-0c3f-4b38-96ab-5d6243a7aa85","Type":"ContainerStarted","Data":"264359e69a96ae168952e5c9d3807caf1e9e5b9b8bfe936835e6ede52acc9e95"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.479873 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf" Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.490035 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9wgjd" event={"ID":"170ae05d-7931-4f6b-a1b1-d4cf1c6709ba","Type":"ContainerStarted","Data":"7258b892f5fcc8850d77f388eea9935d79dad76004d3cc6775369c5dd8a176de"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.496515 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q9dxv" event={"ID":"f647e6b6-7d7f-4c72-9506-af98598583fc","Type":"ContainerStarted","Data":"c95348c353f8e6ad13b57c1f49a18452ee5bb4b55c18ac523351c2be8b82a05b"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.517306 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gx875" event={"ID":"834546fa-9104-4394-a674-e0350de62fb1","Type":"ContainerStarted","Data":"27f82debeab5b09bd56f19ef600a5745d895a5a0817a061216b7ac8a45d3b5f2"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.518303 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gx875" Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.525206 4753 patch_prober.go:28] 
interesting pod/marketplace-operator-79b997595-gx875 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.525344 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gx875" podUID="834546fa-9104-4394-a674-e0350de62fb1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.526471 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-4pbr6" podStartSLOduration=124.52644983 podStartE2EDuration="2m4.52644983s" podCreationTimestamp="2025-10-05 20:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:18.525261214 +0000 UTC m=+147.373589446" watchObservedRunningTime="2025-10-05 20:17:18.52644983 +0000 UTC m=+147.374778062" Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.537159 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.539959 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-05 20:17:19.039942689 +0000 UTC m=+147.888270921 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.541334 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-459zr" event={"ID":"2b92548e-588c-43d5-a99e-9fe9558c8526","Type":"ContainerStarted","Data":"92f341013030af8fe1f97dbdcf91a1d206303652a931a6bbb980c1993cf29c06"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.541369 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-459zr" event={"ID":"2b92548e-588c-43d5-a99e-9fe9558c8526","Type":"ContainerStarted","Data":"b49cb5053d9e0d100e67d3f68fedabde40369596a87f9da64ff258155d1a6783"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.541382 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-459zr" Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.554528 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h7rzp" event={"ID":"7da48090-042e-4fef-afdf-9e6e54a89fe2","Type":"ContainerStarted","Data":"128076792bfc030e5c3d3d8bae2673418c0b0abae0760cd91b275c083b6017a0"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.565600 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" 
event={"ID":"74a4246e-6a4f-4e6b-9562-a3746d802003","Type":"ContainerStarted","Data":"942f7a13267bbd997dfb1621d42e3aacbeb6749b031383d2c9eeb3d7cca61aee"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.566598 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.569229 4753 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-xkst4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.569265 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" podUID="74a4246e-6a4f-4e6b-9562-a3746d802003" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.575174 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" event={"ID":"43435d4b-6820-4794-a431-3946e0486b20","Type":"ContainerStarted","Data":"fdd5afea0aa0a28c7aec587d0cf49d445181730eb749ffbba950c7fd28274fe8"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.575604 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.576963 4753 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rrzxd container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection 
refused" start-of-body= Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.577010 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" podUID="43435d4b-6820-4794-a431-3946e0486b20" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.583115 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" event={"ID":"9a13243e-13d5-4a22-9417-8dfc3896f332","Type":"ContainerStarted","Data":"112b029c04d5d69f62364dc89828c96d59e31f1b924a3fc217175506c24e3929"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.595826 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm" event={"ID":"0487fb96-92f2-462f-a454-528377db3dd4","Type":"ContainerStarted","Data":"49c1a0f37430e5b0e3011013552855ecbbf939dd799f332a34d817fbc4989641"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.609033 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" event={"ID":"322f0415-b81e-4f7a-adec-e49a0460844b","Type":"ContainerStarted","Data":"f855df435f7441b01c443db2760aaf4ebbfcbc44ad06e304ab7d08b239faf618"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.629207 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" event={"ID":"ad58fa0b-b61d-4afa-bf14-f82b2b976bdd","Type":"ContainerStarted","Data":"fc970501d5a58782229f213878f3f7d3e0e719534102c94bb8dea9ec4f028672"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.639676 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.639896 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.139871758 +0000 UTC m=+147.988199990 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.641165 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.642752 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.142732935 +0000 UTC m=+147.991061167 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.651277 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjp5g" event={"ID":"99413e5b-15a4-40f6-b7a5-2dbee5eafb1d","Type":"ContainerStarted","Data":"5fdac560b69e07c0abdcdd3d1dc48b4539425887715a599f4774281cf4e4a326"} Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.743822 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.745434 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.245418798 +0000 UTC m=+148.093747030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.746171 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.763407 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf" podStartSLOduration=124.763388763 podStartE2EDuration="2m4.763388763s" podCreationTimestamp="2025-10-05 20:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:18.606064144 +0000 UTC m=+147.454392376" watchObservedRunningTime="2025-10-05 20:17:18.763388763 +0000 UTC m=+147.611716995" Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.771488 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.271471467 +0000 UTC m=+148.119799699 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.772969 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gx875" podStartSLOduration=124.772945122 podStartE2EDuration="2m4.772945122s" podCreationTimestamp="2025-10-05 20:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:18.755609267 +0000 UTC m=+147.603937499" watchObservedRunningTime="2025-10-05 20:17:18.772945122 +0000 UTC m=+147.621273364" Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.847866 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.848089 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.348049229 +0000 UTC m=+148.196377461 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.848625 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.849003 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.348995768 +0000 UTC m=+148.197324000 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.942957 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-nfxm6" podStartSLOduration=124.942935795 podStartE2EDuration="2m4.942935795s" podCreationTimestamp="2025-10-05 20:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:18.83918694 +0000 UTC m=+147.687515162" watchObservedRunningTime="2025-10-05 20:17:18.942935795 +0000 UTC m=+147.791264027" Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.949926 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.950173 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.450130833 +0000 UTC m=+148.298459065 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.950298 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:18 crc kubenswrapper[4753]: E1005 20:17:18.950775 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.450756932 +0000 UTC m=+148.299085164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:18 crc kubenswrapper[4753]: I1005 20:17:18.997183 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" podStartSLOduration=125.997166339 podStartE2EDuration="2m5.997166339s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:18.94374518 +0000 UTC m=+147.792073412" watchObservedRunningTime="2025-10-05 20:17:18.997166339 +0000 UTC m=+147.845494561" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.009693 4753 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-dxgld container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.009772 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" podUID="a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 
20:17:19.009761 4753 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-dxgld container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.010045 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" podUID="a414d9b0-4b87-4ecc-ae67-f9f38c33a3f5" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.051112 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.051329 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.55129738 +0000 UTC m=+148.399625612 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.051463 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.052020 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.552004752 +0000 UTC m=+148.400332984 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.102419 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n7kdk" podStartSLOduration=125.102399979 podStartE2EDuration="2m5.102399979s" podCreationTimestamp="2025-10-05 20:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:19.101425559 +0000 UTC m=+147.949753791" watchObservedRunningTime="2025-10-05 20:17:19.102399979 +0000 UTC m=+147.950728211" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.103791 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-q9dxv" podStartSLOduration=126.103785881 podStartE2EDuration="2m6.103785881s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:18.999579812 +0000 UTC m=+147.847908044" watchObservedRunningTime="2025-10-05 20:17:19.103785881 +0000 UTC m=+147.952114113" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.152073 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.152345 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.652330803 +0000 UTC m=+148.500659035 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.189588 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dxgld" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.253692 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.254120 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.754093218 +0000 UTC m=+148.602421440 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.292494 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9wgjd" podStartSLOduration=9.29246647 podStartE2EDuration="9.29246647s" podCreationTimestamp="2025-10-05 20:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:19.220612782 +0000 UTC m=+148.068941024" watchObservedRunningTime="2025-10-05 20:17:19.29246647 +0000 UTC m=+148.140794702" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.293861 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" podStartSLOduration=125.293854812 podStartE2EDuration="2m5.293854812s" podCreationTimestamp="2025-10-05 20:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:19.293540933 +0000 UTC m=+148.141869165" watchObservedRunningTime="2025-10-05 20:17:19.293854812 +0000 UTC m=+148.142183044" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.354371 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.354532 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.854510221 +0000 UTC m=+148.702838443 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.354949 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.355310 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.855294365 +0000 UTC m=+148.703622597 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.408819 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xjp5g" podStartSLOduration=125.408803027 podStartE2EDuration="2m5.408803027s" podCreationTimestamp="2025-10-05 20:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:19.356480771 +0000 UTC m=+148.204809003" watchObservedRunningTime="2025-10-05 20:17:19.408803027 +0000 UTC m=+148.257131259" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.429908 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:19 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:19 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:19 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.429976 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.455866 4753 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.456037 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.956006958 +0000 UTC m=+148.804335200 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.456114 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.456898 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:19.956859794 +0000 UTC m=+148.805188026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.461251 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.461844 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 05 20:17:19 crc kubenswrapper[4753]: W1005 20:17:19.467067 4753 reflector.go:561] object-"openshift-kube-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-kube-controller-manager": no relationship found between node 'crc' and this object Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.467111 4753 reflector.go:158] "Unhandled Error" err="object-\"openshift-kube-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-kube-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 05 20:17:19 crc kubenswrapper[4753]: W1005 20:17:19.467166 4753 reflector.go:561] object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n": failed to list *v1.Secret: secrets "installer-sa-dockercfg-kjl2n" is forbidden: User 
"system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-kube-controller-manager": no relationship found between node 'crc' and this object Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.467180 4753 reflector.go:158] "Unhandled Error" err="object-\"openshift-kube-controller-manager\"/\"installer-sa-dockercfg-kjl2n\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"installer-sa-dockercfg-kjl2n\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-kube-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.469558 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6ntx" podStartSLOduration=125.469520098 podStartE2EDuration="2m5.469520098s" podCreationTimestamp="2025-10-05 20:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:19.459073531 +0000 UTC m=+148.307402033" watchObservedRunningTime="2025-10-05 20:17:19.469520098 +0000 UTC m=+148.317848330" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.519533 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.520198 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" podStartSLOduration=125.520182633 podStartE2EDuration="2m5.520182633s" podCreationTimestamp="2025-10-05 20:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:19.496110974 +0000 UTC m=+148.344439206" 
watchObservedRunningTime="2025-10-05 20:17:19.520182633 +0000 UTC m=+148.368510865" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.554547 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" podStartSLOduration=126.554512464 podStartE2EDuration="2m6.554512464s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:19.552257556 +0000 UTC m=+148.400585798" watchObservedRunningTime="2025-10-05 20:17:19.554512464 +0000 UTC m=+148.402840696" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.557546 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.557658 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8345c24a-2388-42cf-b5fc-6ba7a4825f04-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8345c24a-2388-42cf-b5fc-6ba7a4825f04\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.557777 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:20.057746592 +0000 UTC m=+148.906074824 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.557946 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8345c24a-2388-42cf-b5fc-6ba7a4825f04-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8345c24a-2388-42cf-b5fc-6ba7a4825f04\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.558187 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.558488 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:20.058480634 +0000 UTC m=+148.906808866 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.664304 4753 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rrzxd container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.664374 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" podUID="43435d4b-6820-4794-a431-3946e0486b20" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.664412 4753 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gx875 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.664483 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gx875" podUID="834546fa-9104-4394-a674-e0350de62fb1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Oct 05 20:17:19 crc 
kubenswrapper[4753]: I1005 20:17:19.664800 4753 patch_prober.go:28] interesting pod/console-operator-58897d9998-wsrwg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.664830 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wsrwg" podUID="a19f7732-a06f-4b93-b549-4c0a4ccd23fa" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.665794 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.667559 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:20.16754099 +0000 UTC m=+149.015869222 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.667904 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8345c24a-2388-42cf-b5fc-6ba7a4825f04-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8345c24a-2388-42cf-b5fc-6ba7a4825f04\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.668067 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8345c24a-2388-42cf-b5fc-6ba7a4825f04-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8345c24a-2388-42cf-b5fc-6ba7a4825f04\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.668378 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.668477 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.672521 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:20.17250124 +0000 UTC m=+149.020829472 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.672688 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8345c24a-2388-42cf-b5fc-6ba7a4825f04-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8345c24a-2388-42cf-b5fc-6ba7a4825f04\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.681950 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.707435 4753 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" podStartSLOduration=125.707417078 podStartE2EDuration="2m5.707417078s" podCreationTimestamp="2025-10-05 20:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:19.628012932 +0000 UTC m=+148.476341154" watchObservedRunningTime="2025-10-05 20:17:19.707417078 +0000 UTC m=+148.555745310" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.728702 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2fqws" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.758787 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-459zr" podStartSLOduration=9.758771236 podStartE2EDuration="9.758771236s" podCreationTimestamp="2025-10-05 20:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:19.757868559 +0000 UTC m=+148.606196791" watchObservedRunningTime="2025-10-05 20:17:19.758771236 +0000 UTC m=+148.607099468" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.759753 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8qwsm" podStartSLOduration=126.759748425 podStartE2EDuration="2m6.759748425s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:19.721000791 +0000 UTC m=+148.569329023" watchObservedRunningTime="2025-10-05 20:17:19.759748425 +0000 UTC m=+148.608076657" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.769823 4753 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.770094 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.770194 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.770219 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.772133 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-05 20:17:20.27211235 +0000 UTC m=+149.120440582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.774168 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.776257 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.814531 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.815763 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.816950 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.828289 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h7rzp" podStartSLOduration=125.828270632 podStartE2EDuration="2m5.828270632s" podCreationTimestamp="2025-10-05 20:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:19.817787495 +0000 UTC m=+148.666115727" watchObservedRunningTime="2025-10-05 20:17:19.828270632 +0000 UTC m=+148.676598864" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.832613 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.873451 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.873779 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:20.373767572 +0000 UTC m=+149.222095804 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:19 crc kubenswrapper[4753]: I1005 20:17:19.975462 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:19 crc kubenswrapper[4753]: E1005 20:17:19.975848 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:20.475831885 +0000 UTC m=+149.324160117 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.093098 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:20 crc kubenswrapper[4753]: E1005 20:17:20.093413 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:20.593401739 +0000 UTC m=+149.441729971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.193853 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:20 crc kubenswrapper[4753]: E1005 20:17:20.194168 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:20.694152543 +0000 UTC m=+149.542480775 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.294965 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:20 crc kubenswrapper[4753]: E1005 20:17:20.295530 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:20.795510535 +0000 UTC m=+149.643838767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.395912 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:20 crc kubenswrapper[4753]: E1005 20:17:20.396574 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:20.896556399 +0000 UTC m=+149.744884631 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.420610 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-wsrwg" Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.442993 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:20 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:20 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:20 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.443045 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.500530 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:20 crc kubenswrapper[4753]: E1005 20:17:20.500829 
4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:21.000817269 +0000 UTC m=+149.849145501 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.601802 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:20 crc kubenswrapper[4753]: E1005 20:17:20.603300 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:21.102150951 +0000 UTC m=+149.950479183 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.661242 4753 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-xkst4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.661564 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" podUID="74a4246e-6a4f-4e6b-9562-a3746d802003" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.670221 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" event={"ID":"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8","Type":"ContainerStarted","Data":"32a388bfba3d4dbba9493a767bec4f828ea777a1a5e27fbe0e53bbfda0f7ece2"} Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.670300 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" event={"ID":"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8","Type":"ContainerStarted","Data":"c8605eed16453b2553e84675ca450f0fcd5714c708bc0797d1c3a25da32cbf51"} Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 
20:17:20.670841 4753 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gx875 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.670900 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gx875" podUID="834546fa-9104-4394-a674-e0350de62fb1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.705453 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:20 crc kubenswrapper[4753]: E1005 20:17:20.707039 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:21.20702563 +0000 UTC m=+150.055353852 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:20 crc kubenswrapper[4753]: E1005 20:17:20.708235 4753 projected.go:288] Couldn't get configMap openshift-kube-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 05 20:17:20 crc kubenswrapper[4753]: E1005 20:17:20.708253 4753 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/revision-pruner-9-crc: failed to sync configmap cache: timed out waiting for the condition Oct 05 20:17:20 crc kubenswrapper[4753]: E1005 20:17:20.708283 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8345c24a-2388-42cf-b5fc-6ba7a4825f04-kube-api-access podName:8345c24a-2388-42cf-b5fc-6ba7a4825f04 nodeName:}" failed. No retries permitted until 2025-10-05 20:17:21.208274058 +0000 UTC m=+150.056602290 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/8345c24a-2388-42cf-b5fc-6ba7a4825f04-kube-api-access") pod "revision-pruner-9-crc" (UID: "8345c24a-2388-42cf-b5fc-6ba7a4825f04") : failed to sync configmap cache: timed out waiting for the condition Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.778387 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.784557 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrzxd" Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.809965 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:20 crc kubenswrapper[4753]: E1005 20:17:20.810064 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:21.310046833 +0000 UTC m=+150.158375065 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.810268 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:20 crc kubenswrapper[4753]: E1005 20:17:20.810822 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:21.310815137 +0000 UTC m=+150.159143369 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.911108 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:20 crc kubenswrapper[4753]: E1005 20:17:20.911391 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:21.411376045 +0000 UTC m=+150.259704277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:20 crc kubenswrapper[4753]: I1005 20:17:20.984596 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.020717 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:21 crc kubenswrapper[4753]: E1005 20:17:21.021033 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:21.521019838 +0000 UTC m=+150.369348060 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.122047 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:21 crc kubenswrapper[4753]: E1005 20:17:21.122335 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:21.622320789 +0000 UTC m=+150.470649021 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.224155 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8345c24a-2388-42cf-b5fc-6ba7a4825f04-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8345c24a-2388-42cf-b5fc-6ba7a4825f04\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.224283 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:21 crc kubenswrapper[4753]: E1005 20:17:21.224662 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:21.72462816 +0000 UTC m=+150.572956392 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.232087 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8345c24a-2388-42cf-b5fc-6ba7a4825f04-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8345c24a-2388-42cf-b5fc-6ba7a4825f04\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.281741 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.324990 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:21 crc kubenswrapper[4753]: E1005 20:17:21.325343 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:21.825318413 +0000 UTC m=+150.673646645 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.426928 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:21 crc kubenswrapper[4753]: E1005 20:17:21.427278 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:21.927266533 +0000 UTC m=+150.775594765 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.438322 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:21 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:21 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:21 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.438396 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.456200 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vm5zp"] Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.457101 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.460555 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.528626 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:21 crc kubenswrapper[4753]: E1005 20:17:21.529298 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:22.029282875 +0000 UTC m=+150.877611107 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.549500 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vm5zp"] Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.600122 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hldsk"] Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.600984 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.609266 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.623939 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hldsk"] Oct 05 20:17:21 crc kubenswrapper[4753]: W1005 20:17:21.629515 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-827760172e1d6a484d9f833caaac5da2b0c1126fc5f1414a68ffbf869577f26e WatchSource:0}: Error finding container 827760172e1d6a484d9f833caaac5da2b0c1126fc5f1414a68ffbf869577f26e: Status 404 returned error can't find the container with id 827760172e1d6a484d9f833caaac5da2b0c1126fc5f1414a68ffbf869577f26e Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.630165 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhz65\" (UniqueName: \"kubernetes.io/projected/846a9f6b-930c-4527-9b57-c8ca70b345d0-kube-api-access-qhz65\") pod \"certified-operators-vm5zp\" (UID: \"846a9f6b-930c-4527-9b57-c8ca70b345d0\") " pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.630199 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/846a9f6b-930c-4527-9b57-c8ca70b345d0-utilities\") pod \"certified-operators-vm5zp\" (UID: \"846a9f6b-930c-4527-9b57-c8ca70b345d0\") " pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.630329 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/846a9f6b-930c-4527-9b57-c8ca70b345d0-catalog-content\") pod \"certified-operators-vm5zp\" (UID: \"846a9f6b-930c-4527-9b57-c8ca70b345d0\") " pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.630485 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:21 crc kubenswrapper[4753]: E1005 20:17:21.630833 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:22.130819393 +0000 UTC m=+150.979147615 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.659114 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xkst4" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.731541 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.731806 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/befdf821-5bbf-4606-970d-ed88cf79993d-utilities\") pod \"community-operators-hldsk\" (UID: \"befdf821-5bbf-4606-970d-ed88cf79993d\") " pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.731841 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/befdf821-5bbf-4606-970d-ed88cf79993d-catalog-content\") pod \"community-operators-hldsk\" (UID: \"befdf821-5bbf-4606-970d-ed88cf79993d\") " pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.731893 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qhz65\" (UniqueName: \"kubernetes.io/projected/846a9f6b-930c-4527-9b57-c8ca70b345d0-kube-api-access-qhz65\") pod \"certified-operators-vm5zp\" (UID: \"846a9f6b-930c-4527-9b57-c8ca70b345d0\") " pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.731918 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/846a9f6b-930c-4527-9b57-c8ca70b345d0-utilities\") pod \"certified-operators-vm5zp\" (UID: \"846a9f6b-930c-4527-9b57-c8ca70b345d0\") " pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.731944 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mrvv\" (UniqueName: \"kubernetes.io/projected/befdf821-5bbf-4606-970d-ed88cf79993d-kube-api-access-6mrvv\") pod \"community-operators-hldsk\" (UID: \"befdf821-5bbf-4606-970d-ed88cf79993d\") " pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.731972 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/846a9f6b-930c-4527-9b57-c8ca70b345d0-catalog-content\") pod \"certified-operators-vm5zp\" (UID: \"846a9f6b-930c-4527-9b57-c8ca70b345d0\") " pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.733283 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"827760172e1d6a484d9f833caaac5da2b0c1126fc5f1414a68ffbf869577f26e"} Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.742257 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/846a9f6b-930c-4527-9b57-c8ca70b345d0-utilities\") pod \"certified-operators-vm5zp\" (UID: \"846a9f6b-930c-4527-9b57-c8ca70b345d0\") " pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:17:21 crc kubenswrapper[4753]: E1005 20:17:21.743154 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:22.243121767 +0000 UTC m=+151.091449999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.746474 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/846a9f6b-930c-4527-9b57-c8ca70b345d0-catalog-content\") pod \"certified-operators-vm5zp\" (UID: \"846a9f6b-930c-4527-9b57-c8ca70b345d0\") " pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.771475 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ff3c01c576854bdd8e2ff54ddfe6a3297944c9672d19bf1a1dc026da39d4e951"} Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.801563 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhz65\" (UniqueName: 
\"kubernetes.io/projected/846a9f6b-930c-4527-9b57-c8ca70b345d0-kube-api-access-qhz65\") pod \"certified-operators-vm5zp\" (UID: \"846a9f6b-930c-4527-9b57-c8ca70b345d0\") " pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.810451 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-56tnl"] Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.816641 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" event={"ID":"33b1b99b-4b3a-4d76-91ae-5bf3c0e12fd8","Type":"ContainerStarted","Data":"bb646ed05add097f322d095f185493737b0708b640798ff643b1a6e78a6f328c"} Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.816689 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"df167609df6bcdda4738af968f8a34275f6f114e875ea8bb2b77af0cc93145d7"} Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.816792 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.832974 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.833018 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/befdf821-5bbf-4606-970d-ed88cf79993d-utilities\") pod \"community-operators-hldsk\" (UID: \"befdf821-5bbf-4606-970d-ed88cf79993d\") " pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.833057 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/befdf821-5bbf-4606-970d-ed88cf79993d-catalog-content\") pod \"community-operators-hldsk\" (UID: \"befdf821-5bbf-4606-970d-ed88cf79993d\") " pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.833085 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mrvv\" (UniqueName: \"kubernetes.io/projected/befdf821-5bbf-4606-970d-ed88cf79993d-kube-api-access-6mrvv\") pod \"community-operators-hldsk\" (UID: \"befdf821-5bbf-4606-970d-ed88cf79993d\") " pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:17:21 crc kubenswrapper[4753]: E1005 20:17:21.833662 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-05 20:17:22.333651871 +0000 UTC m=+151.181980103 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.834118 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/befdf821-5bbf-4606-970d-ed88cf79993d-utilities\") pod \"community-operators-hldsk\" (UID: \"befdf821-5bbf-4606-970d-ed88cf79993d\") " pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.834336 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/befdf821-5bbf-4606-970d-ed88cf79993d-catalog-content\") pod \"community-operators-hldsk\" (UID: \"befdf821-5bbf-4606-970d-ed88cf79993d\") " pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.836302 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-56tnl"] Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.889938 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mrvv\" (UniqueName: \"kubernetes.io/projected/befdf821-5bbf-4606-970d-ed88cf79993d-kube-api-access-6mrvv\") pod \"community-operators-hldsk\" (UID: \"befdf821-5bbf-4606-970d-ed88cf79993d\") " pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.912225 4753 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-w2sqj" podStartSLOduration=11.912209693 podStartE2EDuration="11.912209693s" podCreationTimestamp="2025-10-05 20:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:21.858281368 +0000 UTC m=+150.706609600" watchObservedRunningTime="2025-10-05 20:17:21.912209693 +0000 UTC m=+150.760537925" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.944570 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.945002 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ded39b3c-186e-4798-a5e3-eefcef9ebd41-catalog-content\") pod \"certified-operators-56tnl\" (UID: \"ded39b3c-186e-4798-a5e3-eefcef9ebd41\") " pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.945128 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ded39b3c-186e-4798-a5e3-eefcef9ebd41-utilities\") pod \"certified-operators-56tnl\" (UID: \"ded39b3c-186e-4798-a5e3-eefcef9ebd41\") " pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.945203 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l85x\" (UniqueName: 
\"kubernetes.io/projected/ded39b3c-186e-4798-a5e3-eefcef9ebd41-kube-api-access-7l85x\") pod \"certified-operators-56tnl\" (UID: \"ded39b3c-186e-4798-a5e3-eefcef9ebd41\") " pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:17:21 crc kubenswrapper[4753]: E1005 20:17:21.945854 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:22.445838822 +0000 UTC m=+151.294167054 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.947556 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.996251 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mtszk"] Oct 05 20:17:21 crc kubenswrapper[4753]: I1005 20:17:21.997241 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.031123 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mtszk"] Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.047369 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ded39b3c-186e-4798-a5e3-eefcef9ebd41-catalog-content\") pod \"certified-operators-56tnl\" (UID: \"ded39b3c-186e-4798-a5e3-eefcef9ebd41\") " pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.047477 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.048230 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ded39b3c-186e-4798-a5e3-eefcef9ebd41-utilities\") pod \"certified-operators-56tnl\" (UID: \"ded39b3c-186e-4798-a5e3-eefcef9ebd41\") " pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.048261 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l85x\" (UniqueName: \"kubernetes.io/projected/ded39b3c-186e-4798-a5e3-eefcef9ebd41-kube-api-access-7l85x\") pod \"certified-operators-56tnl\" (UID: \"ded39b3c-186e-4798-a5e3-eefcef9ebd41\") " pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:17:22 crc kubenswrapper[4753]: E1005 20:17:22.048834 4753 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:22.548821634 +0000 UTC m=+151.397149866 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.050770 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.051748 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.051399 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ded39b3c-186e-4798-a5e3-eefcef9ebd41-utilities\") pod \"certified-operators-56tnl\" (UID: \"ded39b3c-186e-4798-a5e3-eefcef9ebd41\") " pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.052693 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ded39b3c-186e-4798-a5e3-eefcef9ebd41-catalog-content\") pod \"certified-operators-56tnl\" (UID: \"ded39b3c-186e-4798-a5e3-eefcef9ebd41\") " pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.062831 4753 patch_prober.go:28] interesting pod/console-f9d7485db-7klvp container/console 
namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.062884 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7klvp" podUID="eb329af5-99e8-42d9-b79e-4c9acd09204d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.088043 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l85x\" (UniqueName: \"kubernetes.io/projected/ded39b3c-186e-4798-a5e3-eefcef9ebd41-kube-api-access-7l85x\") pod \"certified-operators-56tnl\" (UID: \"ded39b3c-186e-4798-a5e3-eefcef9ebd41\") " pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.091424 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.102674 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.102710 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.143201 4753 patch_prober.go:28] interesting pod/apiserver-76f77b778f-cmpqd container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 05 20:17:22 crc kubenswrapper[4753]: [+]log ok Oct 05 20:17:22 crc kubenswrapper[4753]: [+]etcd ok Oct 05 20:17:22 crc kubenswrapper[4753]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 05 20:17:22 crc kubenswrapper[4753]: [+]poststarthook/generic-apiserver-start-informers ok Oct 05 20:17:22 crc kubenswrapper[4753]: [+]poststarthook/max-in-flight-filter ok Oct 05 20:17:22 crc kubenswrapper[4753]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 05 20:17:22 crc kubenswrapper[4753]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 05 20:17:22 crc kubenswrapper[4753]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 05 20:17:22 crc kubenswrapper[4753]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 05 20:17:22 crc kubenswrapper[4753]: [+]poststarthook/project.openshift.io-projectcache ok Oct 05 20:17:22 crc kubenswrapper[4753]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 05 20:17:22 crc kubenswrapper[4753]: [+]poststarthook/openshift.io-startinformers ok Oct 05 20:17:22 crc kubenswrapper[4753]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 05 20:17:22 crc 
kubenswrapper[4753]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 05 20:17:22 crc kubenswrapper[4753]: livez check failed Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.143251 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" podUID="ad58fa0b-b61d-4afa-bf14-f82b2b976bdd" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.152489 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:22 crc kubenswrapper[4753]: E1005 20:17:22.152591 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:22.652575379 +0000 UTC m=+151.500903611 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.153618 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e1c696-7a58-4427-a56d-6a774fa06532-catalog-content\") pod \"community-operators-mtszk\" (UID: \"a5e1c696-7a58-4427-a56d-6a774fa06532\") " pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.153673 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n92lx\" (UniqueName: \"kubernetes.io/projected/a5e1c696-7a58-4427-a56d-6a774fa06532-kube-api-access-n92lx\") pod \"community-operators-mtszk\" (UID: \"a5e1c696-7a58-4427-a56d-6a774fa06532\") " pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.153699 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:22 crc kubenswrapper[4753]: E1005 20:17:22.154085 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-05 20:17:22.654075455 +0000 UTC m=+151.502403677 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.154910 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e1c696-7a58-4427-a56d-6a774fa06532-utilities\") pod \"community-operators-mtszk\" (UID: \"a5e1c696-7a58-4427-a56d-6a774fa06532\") " pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.158388 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.218038 4753 patch_prober.go:28] interesting pod/downloads-7954f5f757-bnpxs container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.218338 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-bnpxs" podUID="0cae56c9-6ca6-49f3-97d2-14e8a6748315" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.218072 4753 patch_prober.go:28] interesting pod/downloads-7954f5f757-bnpxs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.218417 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bnpxs" podUID="0cae56c9-6ca6-49f3-97d2-14e8a6748315" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.259851 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.260190 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e1c696-7a58-4427-a56d-6a774fa06532-utilities\") pod \"community-operators-mtszk\" (UID: \"a5e1c696-7a58-4427-a56d-6a774fa06532\") " pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.260247 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e1c696-7a58-4427-a56d-6a774fa06532-catalog-content\") pod \"community-operators-mtszk\" (UID: \"a5e1c696-7a58-4427-a56d-6a774fa06532\") " pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.260265 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n92lx\" (UniqueName: \"kubernetes.io/projected/a5e1c696-7a58-4427-a56d-6a774fa06532-kube-api-access-n92lx\") pod \"community-operators-mtszk\" (UID: \"a5e1c696-7a58-4427-a56d-6a774fa06532\") " pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:17:22 crc kubenswrapper[4753]: E1005 20:17:22.260615 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:22.760599574 +0000 UTC m=+151.608927806 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.262465 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e1c696-7a58-4427-a56d-6a774fa06532-utilities\") pod \"community-operators-mtszk\" (UID: \"a5e1c696-7a58-4427-a56d-6a774fa06532\") " pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.262998 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e1c696-7a58-4427-a56d-6a774fa06532-catalog-content\") pod \"community-operators-mtszk\" (UID: \"a5e1c696-7a58-4427-a56d-6a774fa06532\") " pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.298986 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n92lx\" (UniqueName: \"kubernetes.io/projected/a5e1c696-7a58-4427-a56d-6a774fa06532-kube-api-access-n92lx\") pod \"community-operators-mtszk\" (UID: \"a5e1c696-7a58-4427-a56d-6a774fa06532\") " pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.338434 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.361328 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:22 crc kubenswrapper[4753]: E1005 20:17:22.362015 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:22.862002637 +0000 UTC m=+151.710330869 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.367131 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.439221 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:22 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:22 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:22 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.440582 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.464452 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:22 crc kubenswrapper[4753]: E1005 20:17:22.464822 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:22.964807454 +0000 UTC m=+151.813135686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.477532 4753 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.534625 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hldsk"] Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.565560 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:22 crc kubenswrapper[4753]: E1005 20:17:22.566181 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:23.066168066 +0000 UTC m=+151.914496298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.667301 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:22 crc kubenswrapper[4753]: E1005 20:17:22.667511 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:23.167497608 +0000 UTC m=+152.015825840 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.777976 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:22 crc kubenswrapper[4753]: E1005 20:17:22.778862 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-05 20:17:23.278847963 +0000 UTC m=+152.127176205 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.797358 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-56tnl"] Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.883943 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:22 crc kubenswrapper[4753]: E1005 20:17:22.890833 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-05 20:17:23.390779606 +0000 UTC m=+152.239107838 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.964261 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8345c24a-2388-42cf-b5fc-6ba7a4825f04","Type":"ContainerStarted","Data":"7c2e2bc2f40b3d648294dc122f86f71abd81c359153be8be724584a121bcc47e"} Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.966519 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mtszk"] Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.989349 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ad5684a4e1db7210613153ebea766ec89549ce6b43a2607b46fbd7dc4af9073a"} Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.990556 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:22 crc kubenswrapper[4753]: E1005 20:17:22.990835 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-05 20:17:23.490824489 +0000 UTC m=+152.339152721 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rr2d7" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 05 20:17:22 crc kubenswrapper[4753]: I1005 20:17:22.993619 4753 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-05T20:17:22.47755941Z","Handler":null,"Name":""} Oct 05 20:17:22 crc kubenswrapper[4753]: W1005 20:17:22.997251 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5e1c696_7a58_4427_a56d_6a774fa06532.slice/crio-5730dcc1b1a0394cb740404fa6f00225ed1eaa5374e461477718e6db0aec61dc WatchSource:0}: Error finding container 5730dcc1b1a0394cb740404fa6f00225ed1eaa5374e461477718e6db0aec61dc: Status 404 returned error can't find the container with id 5730dcc1b1a0394cb740404fa6f00225ed1eaa5374e461477718e6db0aec61dc Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.001483 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"40df7567d920a3efab8f913f10542df49d182441e302c480566d19121831582a"} Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.001921 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.020187 
4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hldsk" event={"ID":"befdf821-5bbf-4606-970d-ed88cf79993d","Type":"ContainerStarted","Data":"13c450557ee0aba43b33f09e9bfd9f9c74b8fcf80b043855c94b368def418e83"} Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.023043 4753 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.023067 4753 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.023820 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.027573 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56tnl" event={"ID":"ded39b3c-186e-4798-a5e3-eefcef9ebd41","Type":"ContainerStarted","Data":"20915f74f3f675a0c984ddf6ccabab545786f1c34f0730ed31c2d6cca9f9b603"} Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.040527 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"07eaa58b232e0d8542ddfaf3f881bb4dd78aeb9ce4543b831bf4a388b5fcf128"} Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.078626 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vm5zp"] Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.094603 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.139192 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.195804 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.200664 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.200706 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.294724 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rr2d7\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.298759 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.380153 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.380232 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.386750 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.429230 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-zbdx5" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.440366 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:23 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:23 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:23 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.440434 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.536289 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gx875" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.547023 4753 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rr2d7"] Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.588404 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7dx25"] Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.590164 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.598454 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.606553 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7dx25"] Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.705911 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0bfd881-720f-48b7-acdd-7bb4b3722471-catalog-content\") pod \"redhat-marketplace-7dx25\" (UID: \"c0bfd881-720f-48b7-acdd-7bb4b3722471\") " pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.706002 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0bfd881-720f-48b7-acdd-7bb4b3722471-utilities\") pod \"redhat-marketplace-7dx25\" (UID: \"c0bfd881-720f-48b7-acdd-7bb4b3722471\") " pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.706044 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28zrb\" (UniqueName: \"kubernetes.io/projected/c0bfd881-720f-48b7-acdd-7bb4b3722471-kube-api-access-28zrb\") pod \"redhat-marketplace-7dx25\" (UID: \"c0bfd881-720f-48b7-acdd-7bb4b3722471\") " 
pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.806873 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0bfd881-720f-48b7-acdd-7bb4b3722471-catalog-content\") pod \"redhat-marketplace-7dx25\" (UID: \"c0bfd881-720f-48b7-acdd-7bb4b3722471\") " pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.806931 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0bfd881-720f-48b7-acdd-7bb4b3722471-utilities\") pod \"redhat-marketplace-7dx25\" (UID: \"c0bfd881-720f-48b7-acdd-7bb4b3722471\") " pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.806962 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28zrb\" (UniqueName: \"kubernetes.io/projected/c0bfd881-720f-48b7-acdd-7bb4b3722471-kube-api-access-28zrb\") pod \"redhat-marketplace-7dx25\" (UID: \"c0bfd881-720f-48b7-acdd-7bb4b3722471\") " pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.807429 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0bfd881-720f-48b7-acdd-7bb4b3722471-catalog-content\") pod \"redhat-marketplace-7dx25\" (UID: \"c0bfd881-720f-48b7-acdd-7bb4b3722471\") " pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.807592 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0bfd881-720f-48b7-acdd-7bb4b3722471-utilities\") pod \"redhat-marketplace-7dx25\" (UID: \"c0bfd881-720f-48b7-acdd-7bb4b3722471\") " pod="openshift-marketplace/redhat-marketplace-7dx25" 
Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.843392 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28zrb\" (UniqueName: \"kubernetes.io/projected/c0bfd881-720f-48b7-acdd-7bb4b3722471-kube-api-access-28zrb\") pod \"redhat-marketplace-7dx25\" (UID: \"c0bfd881-720f-48b7-acdd-7bb4b3722471\") " pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.879711 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.913057 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.985254 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xm5lr"] Oct 05 20:17:23 crc kubenswrapper[4753]: I1005 20:17:23.986255 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.000492 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xm5lr"] Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.010705 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-utilities\") pod \"redhat-marketplace-xm5lr\" (UID: \"4c69edf5-c1fc-4392-bfb6-f52a114c4c19\") " pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.010761 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jhdn\" (UniqueName: \"kubernetes.io/projected/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-kube-api-access-8jhdn\") pod \"redhat-marketplace-xm5lr\" (UID: \"4c69edf5-c1fc-4392-bfb6-f52a114c4c19\") " pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.010797 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-catalog-content\") pod \"redhat-marketplace-xm5lr\" (UID: \"4c69edf5-c1fc-4392-bfb6-f52a114c4c19\") " pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.064812 4753 generic.go:334] "Generic (PLEG): container finished" podID="8345c24a-2388-42cf-b5fc-6ba7a4825f04" containerID="85b32123ebb2eed981692cadab0fb815ca541383a51d6b2981bb77c4737a3d0b" exitCode=0 Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.065174 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"8345c24a-2388-42cf-b5fc-6ba7a4825f04","Type":"ContainerDied","Data":"85b32123ebb2eed981692cadab0fb815ca541383a51d6b2981bb77c4737a3d0b"} Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.066597 4753 generic.go:334] "Generic (PLEG): container finished" podID="846a9f6b-930c-4527-9b57-c8ca70b345d0" containerID="a096ab7d1afce8c26bac59e3e529bd11c0abcb608d8d1a8fb5ee636a1bb84a9b" exitCode=0 Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.066656 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vm5zp" event={"ID":"846a9f6b-930c-4527-9b57-c8ca70b345d0","Type":"ContainerDied","Data":"a096ab7d1afce8c26bac59e3e529bd11c0abcb608d8d1a8fb5ee636a1bb84a9b"} Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.066674 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vm5zp" event={"ID":"846a9f6b-930c-4527-9b57-c8ca70b345d0","Type":"ContainerStarted","Data":"db3f8ececd5b1c0d0e84d0278ca5ff8f06351f734f6201009fafa10198c661f7"} Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.074793 4753 generic.go:334] "Generic (PLEG): container finished" podID="befdf821-5bbf-4606-970d-ed88cf79993d" containerID="49480f646d71f5338fe80bdfee1ac389cffaba096294c38a080c007af7d78248" exitCode=0 Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.074865 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hldsk" event={"ID":"befdf821-5bbf-4606-970d-ed88cf79993d","Type":"ContainerDied","Data":"49480f646d71f5338fe80bdfee1ac389cffaba096294c38a080c007af7d78248"} Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.077773 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" event={"ID":"5da3ed48-9a09-47d3-9bcb-f572a962b5fb","Type":"ContainerStarted","Data":"932de19ce33539bd124888c4457db35d6c378cdd5b775c9a955f241d45cf5cfe"} Oct 05 20:17:24 crc 
kubenswrapper[4753]: I1005 20:17:24.077802 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" event={"ID":"5da3ed48-9a09-47d3-9bcb-f572a962b5fb","Type":"ContainerStarted","Data":"c5ed0dbfd78c2a823622a02e623e1838096ac86b8576976e195764628fd165a3"} Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.078038 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.085991 4753 generic.go:334] "Generic (PLEG): container finished" podID="10c622e2-78c9-42bd-9031-776cded4435c" containerID="a12c5e14e454f061e96cbde4ee73a98879889ea378e7bd61c2821f3a9c3f6b8f" exitCode=0 Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.086066 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" event={"ID":"10c622e2-78c9-42bd-9031-776cded4435c","Type":"ContainerDied","Data":"a12c5e14e454f061e96cbde4ee73a98879889ea378e7bd61c2821f3a9c3f6b8f"} Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.091238 4753 generic.go:334] "Generic (PLEG): container finished" podID="ded39b3c-186e-4798-a5e3-eefcef9ebd41" containerID="ce4a0acf3a94c7591afed0b0bdc02191bcd5b70685d52443e5a62e12d609266c" exitCode=0 Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.091297 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56tnl" event={"ID":"ded39b3c-186e-4798-a5e3-eefcef9ebd41","Type":"ContainerDied","Data":"ce4a0acf3a94c7591afed0b0bdc02191bcd5b70685d52443e5a62e12d609266c"} Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.112510 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-utilities\") pod \"redhat-marketplace-xm5lr\" (UID: 
\"4c69edf5-c1fc-4392-bfb6-f52a114c4c19\") " pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.112709 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jhdn\" (UniqueName: \"kubernetes.io/projected/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-kube-api-access-8jhdn\") pod \"redhat-marketplace-xm5lr\" (UID: \"4c69edf5-c1fc-4392-bfb6-f52a114c4c19\") " pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.112786 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-catalog-content\") pod \"redhat-marketplace-xm5lr\" (UID: \"4c69edf5-c1fc-4392-bfb6-f52a114c4c19\") " pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.113268 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-utilities\") pod \"redhat-marketplace-xm5lr\" (UID: \"4c69edf5-c1fc-4392-bfb6-f52a114c4c19\") " pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.113576 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-catalog-content\") pod \"redhat-marketplace-xm5lr\" (UID: \"4c69edf5-c1fc-4392-bfb6-f52a114c4c19\") " pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.118728 4753 generic.go:334] "Generic (PLEG): container finished" podID="a5e1c696-7a58-4427-a56d-6a774fa06532" containerID="8c41c5bb8f4bddf25f220b953e565d5e1954e77e01ff6ac7aef105ce5df845cd" exitCode=0 Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.121277 4753 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtszk" event={"ID":"a5e1c696-7a58-4427-a56d-6a774fa06532","Type":"ContainerDied","Data":"8c41c5bb8f4bddf25f220b953e565d5e1954e77e01ff6ac7aef105ce5df845cd"} Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.121391 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtszk" event={"ID":"a5e1c696-7a58-4427-a56d-6a774fa06532","Type":"ContainerStarted","Data":"5730dcc1b1a0394cb740404fa6f00225ed1eaa5374e461477718e6db0aec61dc"} Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.128703 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k5852" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.136453 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" podStartSLOduration=131.136409225 podStartE2EDuration="2m11.136409225s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:24.133080854 +0000 UTC m=+152.981409086" watchObservedRunningTime="2025-10-05 20:17:24.136409225 +0000 UTC m=+152.984737457" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.138645 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jhdn\" (UniqueName: \"kubernetes.io/projected/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-kube-api-access-8jhdn\") pod \"redhat-marketplace-xm5lr\" (UID: \"4c69edf5-c1fc-4392-bfb6-f52a114c4c19\") " pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.321089 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.384770 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7dx25"] Oct 05 20:17:24 crc kubenswrapper[4753]: W1005 20:17:24.401754 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0bfd881_720f_48b7_acdd_7bb4b3722471.slice/crio-2c4b026c3300f58faef47435768d5f8369e9eaffe280a227f462a6390fa03ec4 WatchSource:0}: Error finding container 2c4b026c3300f58faef47435768d5f8369e9eaffe280a227f462a6390fa03ec4: Status 404 returned error can't find the container with id 2c4b026c3300f58faef47435768d5f8369e9eaffe280a227f462a6390fa03ec4 Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.427163 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:24 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:24 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:24 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.427229 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.591741 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xwh7d"] Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.593721 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.596527 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.610052 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xwh7d"] Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.725499 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh9xd\" (UniqueName: \"kubernetes.io/projected/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-kube-api-access-qh9xd\") pod \"redhat-operators-xwh7d\" (UID: \"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4\") " pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.725641 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-catalog-content\") pod \"redhat-operators-xwh7d\" (UID: \"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4\") " pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.725710 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-utilities\") pod \"redhat-operators-xwh7d\" (UID: \"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4\") " pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.806447 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xm5lr"] Oct 05 20:17:24 crc kubenswrapper[4753]: W1005 20:17:24.810613 4753 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c69edf5_c1fc_4392_bfb6_f52a114c4c19.slice/crio-e1c169d5d583b10fc701772466b1dc306963baba269084b91138d864d8060790 WatchSource:0}: Error finding container e1c169d5d583b10fc701772466b1dc306963baba269084b91138d864d8060790: Status 404 returned error can't find the container with id e1c169d5d583b10fc701772466b1dc306963baba269084b91138d864d8060790 Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.827450 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh9xd\" (UniqueName: \"kubernetes.io/projected/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-kube-api-access-qh9xd\") pod \"redhat-operators-xwh7d\" (UID: \"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4\") " pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.827691 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-utilities\") pod \"redhat-operators-xwh7d\" (UID: \"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4\") " pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.827765 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-catalog-content\") pod \"redhat-operators-xwh7d\" (UID: \"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4\") " pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.828243 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-utilities\") pod \"redhat-operators-xwh7d\" (UID: \"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4\") " pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:17:24 crc kubenswrapper[4753]: 
I1005 20:17:24.828354 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-catalog-content\") pod \"redhat-operators-xwh7d\" (UID: \"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4\") " pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.870670 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh9xd\" (UniqueName: \"kubernetes.io/projected/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-kube-api-access-qh9xd\") pod \"redhat-operators-xwh7d\" (UID: \"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4\") " pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.924579 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.993652 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p4pcf"] Oct 05 20:17:24 crc kubenswrapper[4753]: I1005 20:17:24.994699 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.063124 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p4pcf"] Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.127907 4753 generic.go:334] "Generic (PLEG): container finished" podID="c0bfd881-720f-48b7-acdd-7bb4b3722471" containerID="64b9605d6a8db1f69e151a64497c4f9018a07ce2deb8af5c63de768e7ff270ee" exitCode=0 Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.127978 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dx25" event={"ID":"c0bfd881-720f-48b7-acdd-7bb4b3722471","Type":"ContainerDied","Data":"64b9605d6a8db1f69e151a64497c4f9018a07ce2deb8af5c63de768e7ff270ee"} Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.128005 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dx25" event={"ID":"c0bfd881-720f-48b7-acdd-7bb4b3722471","Type":"ContainerStarted","Data":"2c4b026c3300f58faef47435768d5f8369e9eaffe280a227f462a6390fa03ec4"} Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.129532 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xm5lr" event={"ID":"4c69edf5-c1fc-4392-bfb6-f52a114c4c19","Type":"ContainerStarted","Data":"e1c169d5d583b10fc701772466b1dc306963baba269084b91138d864d8060790"} Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.132011 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6lrf\" (UniqueName: \"kubernetes.io/projected/7329a47a-c470-4d01-92c6-9addeedeb6e1-kube-api-access-n6lrf\") pod \"redhat-operators-p4pcf\" (UID: \"7329a47a-c470-4d01-92c6-9addeedeb6e1\") " pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.132048 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7329a47a-c470-4d01-92c6-9addeedeb6e1-catalog-content\") pod \"redhat-operators-p4pcf\" (UID: \"7329a47a-c470-4d01-92c6-9addeedeb6e1\") " pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.132101 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7329a47a-c470-4d01-92c6-9addeedeb6e1-utilities\") pod \"redhat-operators-p4pcf\" (UID: \"7329a47a-c470-4d01-92c6-9addeedeb6e1\") " pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.233154 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6lrf\" (UniqueName: \"kubernetes.io/projected/7329a47a-c470-4d01-92c6-9addeedeb6e1-kube-api-access-n6lrf\") pod \"redhat-operators-p4pcf\" (UID: \"7329a47a-c470-4d01-92c6-9addeedeb6e1\") " pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.233198 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7329a47a-c470-4d01-92c6-9addeedeb6e1-catalog-content\") pod \"redhat-operators-p4pcf\" (UID: \"7329a47a-c470-4d01-92c6-9addeedeb6e1\") " pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.233291 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7329a47a-c470-4d01-92c6-9addeedeb6e1-utilities\") pod \"redhat-operators-p4pcf\" (UID: \"7329a47a-c470-4d01-92c6-9addeedeb6e1\") " pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.234436 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7329a47a-c470-4d01-92c6-9addeedeb6e1-catalog-content\") pod \"redhat-operators-p4pcf\" (UID: \"7329a47a-c470-4d01-92c6-9addeedeb6e1\") " pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.235283 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7329a47a-c470-4d01-92c6-9addeedeb6e1-utilities\") pod \"redhat-operators-p4pcf\" (UID: \"7329a47a-c470-4d01-92c6-9addeedeb6e1\") " pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.258697 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6lrf\" (UniqueName: \"kubernetes.io/projected/7329a47a-c470-4d01-92c6-9addeedeb6e1-kube-api-access-n6lrf\") pod \"redhat-operators-p4pcf\" (UID: \"7329a47a-c470-4d01-92c6-9addeedeb6e1\") " pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.279521 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xwh7d"] Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.330860 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.406016 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.428543 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:25 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:25 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:25 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.428600 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.474377 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.539904 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8345c24a-2388-42cf-b5fc-6ba7a4825f04-kubelet-dir\") pod \"8345c24a-2388-42cf-b5fc-6ba7a4825f04\" (UID: \"8345c24a-2388-42cf-b5fc-6ba7a4825f04\") " Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.540052 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8345c24a-2388-42cf-b5fc-6ba7a4825f04-kube-api-access\") pod \"8345c24a-2388-42cf-b5fc-6ba7a4825f04\" (UID: \"8345c24a-2388-42cf-b5fc-6ba7a4825f04\") " Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.540949 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8345c24a-2388-42cf-b5fc-6ba7a4825f04-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8345c24a-2388-42cf-b5fc-6ba7a4825f04" (UID: "8345c24a-2388-42cf-b5fc-6ba7a4825f04"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.559417 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8345c24a-2388-42cf-b5fc-6ba7a4825f04-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8345c24a-2388-42cf-b5fc-6ba7a4825f04" (UID: "8345c24a-2388-42cf-b5fc-6ba7a4825f04"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.622665 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p4pcf"] Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.642124 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkzzj\" (UniqueName: \"kubernetes.io/projected/10c622e2-78c9-42bd-9031-776cded4435c-kube-api-access-nkzzj\") pod \"10c622e2-78c9-42bd-9031-776cded4435c\" (UID: \"10c622e2-78c9-42bd-9031-776cded4435c\") " Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.642191 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10c622e2-78c9-42bd-9031-776cded4435c-secret-volume\") pod \"10c622e2-78c9-42bd-9031-776cded4435c\" (UID: \"10c622e2-78c9-42bd-9031-776cded4435c\") " Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.642266 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10c622e2-78c9-42bd-9031-776cded4435c-config-volume\") pod \"10c622e2-78c9-42bd-9031-776cded4435c\" (UID: \"10c622e2-78c9-42bd-9031-776cded4435c\") " Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.643322 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10c622e2-78c9-42bd-9031-776cded4435c-config-volume" (OuterVolumeSpecName: "config-volume") pod "10c622e2-78c9-42bd-9031-776cded4435c" (UID: "10c622e2-78c9-42bd-9031-776cded4435c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.645416 4753 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8345c24a-2388-42cf-b5fc-6ba7a4825f04-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.645434 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8345c24a-2388-42cf-b5fc-6ba7a4825f04-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.645446 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10c622e2-78c9-42bd-9031-776cded4435c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 05 20:17:25 crc kubenswrapper[4753]: W1005 20:17:25.647923 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7329a47a_c470_4d01_92c6_9addeedeb6e1.slice/crio-c6910f0c3cf360bd92c10b9bd640ec3de7251df87d6fcb4eda6e8012c57f3cf3 WatchSource:0}: Error finding container c6910f0c3cf360bd92c10b9bd640ec3de7251df87d6fcb4eda6e8012c57f3cf3: Status 404 returned error can't find the container with id c6910f0c3cf360bd92c10b9bd640ec3de7251df87d6fcb4eda6e8012c57f3cf3 Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.649959 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c622e2-78c9-42bd-9031-776cded4435c-kube-api-access-nkzzj" (OuterVolumeSpecName: "kube-api-access-nkzzj") pod "10c622e2-78c9-42bd-9031-776cded4435c" (UID: "10c622e2-78c9-42bd-9031-776cded4435c"). InnerVolumeSpecName "kube-api-access-nkzzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.651277 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c622e2-78c9-42bd-9031-776cded4435c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "10c622e2-78c9-42bd-9031-776cded4435c" (UID: "10c622e2-78c9-42bd-9031-776cded4435c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.747392 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkzzj\" (UniqueName: \"kubernetes.io/projected/10c622e2-78c9-42bd-9031-776cded4435c-kube-api-access-nkzzj\") on node \"crc\" DevicePath \"\"" Oct 05 20:17:25 crc kubenswrapper[4753]: I1005 20:17:25.747428 4753 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10c622e2-78c9-42bd-9031-776cded4435c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.147695 4753 generic.go:334] "Generic (PLEG): container finished" podID="4c69edf5-c1fc-4392-bfb6-f52a114c4c19" containerID="1f42c035479a7261e2815c3f19cee3e5b35d6f24911688232728da15dd0375dd" exitCode=0 Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.147794 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xm5lr" event={"ID":"4c69edf5-c1fc-4392-bfb6-f52a114c4c19","Type":"ContainerDied","Data":"1f42c035479a7261e2815c3f19cee3e5b35d6f24911688232728da15dd0375dd"} Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.164862 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.165719 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm" event={"ID":"10c622e2-78c9-42bd-9031-776cded4435c","Type":"ContainerDied","Data":"a6d1f167cb92b234b105f681f9be8b86ec592408a577cdb7b1d4dca4f30eaa18"} Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.165764 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6d1f167cb92b234b105f681f9be8b86ec592408a577cdb7b1d4dca4f30eaa18" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.169680 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8345c24a-2388-42cf-b5fc-6ba7a4825f04","Type":"ContainerDied","Data":"7c2e2bc2f40b3d648294dc122f86f71abd81c359153be8be724584a121bcc47e"} Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.169713 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c2e2bc2f40b3d648294dc122f86f71abd81c359153be8be724584a121bcc47e" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.169762 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.176656 4753 generic.go:334] "Generic (PLEG): container finished" podID="7329a47a-c470-4d01-92c6-9addeedeb6e1" containerID="7cc46185e644bf4f2ecde44beb89545766161fafd57e9d5f6ed19ed96682270d" exitCode=0 Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.176729 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4pcf" event={"ID":"7329a47a-c470-4d01-92c6-9addeedeb6e1","Type":"ContainerDied","Data":"7cc46185e644bf4f2ecde44beb89545766161fafd57e9d5f6ed19ed96682270d"} Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.176762 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4pcf" event={"ID":"7329a47a-c470-4d01-92c6-9addeedeb6e1","Type":"ContainerStarted","Data":"c6910f0c3cf360bd92c10b9bd640ec3de7251df87d6fcb4eda6e8012c57f3cf3"} Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.187915 4753 generic.go:334] "Generic (PLEG): container finished" podID="431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" containerID="25c45cc5d0c58c65554d2e948c2219792a406c46f4cb233dff98d2b90b27bc38" exitCode=0 Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.189308 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwh7d" event={"ID":"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4","Type":"ContainerDied","Data":"25c45cc5d0c58c65554d2e948c2219792a406c46f4cb233dff98d2b90b27bc38"} Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.189360 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwh7d" event={"ID":"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4","Type":"ContainerStarted","Data":"c19884472f37aa9fd78e2a2faf72acc8ca443ea504f5362a81a80427fc7cecc2"} Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.349464 4753 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 05 20:17:26 crc kubenswrapper[4753]: E1005 20:17:26.353309 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8345c24a-2388-42cf-b5fc-6ba7a4825f04" containerName="pruner" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.353410 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8345c24a-2388-42cf-b5fc-6ba7a4825f04" containerName="pruner" Oct 05 20:17:26 crc kubenswrapper[4753]: E1005 20:17:26.353529 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c622e2-78c9-42bd-9031-776cded4435c" containerName="collect-profiles" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.353601 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c622e2-78c9-42bd-9031-776cded4435c" containerName="collect-profiles" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.353767 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c622e2-78c9-42bd-9031-776cded4435c" containerName="collect-profiles" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.353832 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8345c24a-2388-42cf-b5fc-6ba7a4825f04" containerName="pruner" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.354521 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.358690 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.368174 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.373816 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.428193 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:26 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:26 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:26 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.428261 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.456354 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/051fe433-d2b7-4c42-8b42-ce0bd8795b58-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"051fe433-d2b7-4c42-8b42-ce0bd8795b58\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.456408 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/051fe433-d2b7-4c42-8b42-ce0bd8795b58-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"051fe433-d2b7-4c42-8b42-ce0bd8795b58\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.558345 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/051fe433-d2b7-4c42-8b42-ce0bd8795b58-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"051fe433-d2b7-4c42-8b42-ce0bd8795b58\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.558401 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/051fe433-d2b7-4c42-8b42-ce0bd8795b58-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"051fe433-d2b7-4c42-8b42-ce0bd8795b58\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.558485 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/051fe433-d2b7-4c42-8b42-ce0bd8795b58-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"051fe433-d2b7-4c42-8b42-ce0bd8795b58\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.578219 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/051fe433-d2b7-4c42-8b42-ce0bd8795b58-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"051fe433-d2b7-4c42-8b42-ce0bd8795b58\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 05 20:17:26 crc kubenswrapper[4753]: I1005 20:17:26.685000 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 05 20:17:27 crc kubenswrapper[4753]: I1005 20:17:27.063590 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 05 20:17:27 crc kubenswrapper[4753]: I1005 20:17:27.108417 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:27 crc kubenswrapper[4753]: I1005 20:17:27.114235 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-cmpqd" Oct 05 20:17:27 crc kubenswrapper[4753]: I1005 20:17:27.214504 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"051fe433-d2b7-4c42-8b42-ce0bd8795b58","Type":"ContainerStarted","Data":"9f02a308185054a6d7bdfd003d7b17d9c8bd03ed373efb066f892ec66525136d"} Oct 05 20:17:27 crc kubenswrapper[4753]: I1005 20:17:27.434056 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:27 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:27 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:27 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:27 crc kubenswrapper[4753]: I1005 20:17:27.434111 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:28 crc kubenswrapper[4753]: I1005 20:17:28.252539 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"051fe433-d2b7-4c42-8b42-ce0bd8795b58","Type":"ContainerStarted","Data":"55de62fe6f6f0ba3c537997b87de4e4ec263721763cdfbc0528c2c3ae83c72d9"} Oct 05 20:17:28 crc kubenswrapper[4753]: I1005 20:17:28.268079 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.268062968 podStartE2EDuration="2.268062968s" podCreationTimestamp="2025-10-05 20:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:28.266102689 +0000 UTC m=+157.114430921" watchObservedRunningTime="2025-10-05 20:17:28.268062968 +0000 UTC m=+157.116391200" Oct 05 20:17:28 crc kubenswrapper[4753]: I1005 20:17:28.426534 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:28 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:28 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:28 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:28 crc kubenswrapper[4753]: I1005 20:17:28.426587 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:28 crc kubenswrapper[4753]: I1005 20:17:28.597557 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-459zr" Oct 05 20:17:29 crc kubenswrapper[4753]: I1005 20:17:29.265625 4753 generic.go:334] "Generic (PLEG): container finished" podID="051fe433-d2b7-4c42-8b42-ce0bd8795b58" containerID="55de62fe6f6f0ba3c537997b87de4e4ec263721763cdfbc0528c2c3ae83c72d9" exitCode=0 Oct 05 
20:17:29 crc kubenswrapper[4753]: I1005 20:17:29.265680 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"051fe433-d2b7-4c42-8b42-ce0bd8795b58","Type":"ContainerDied","Data":"55de62fe6f6f0ba3c537997b87de4e4ec263721763cdfbc0528c2c3ae83c72d9"} Oct 05 20:17:29 crc kubenswrapper[4753]: I1005 20:17:29.428301 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:29 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:29 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:29 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:29 crc kubenswrapper[4753]: I1005 20:17:29.428364 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:30 crc kubenswrapper[4753]: I1005 20:17:30.427152 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:30 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:30 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:30 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:30 crc kubenswrapper[4753]: I1005 20:17:30.427440 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:30 crc 
kubenswrapper[4753]: I1005 20:17:30.832418 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 05 20:17:30 crc kubenswrapper[4753]: I1005 20:17:30.929604 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/051fe433-d2b7-4c42-8b42-ce0bd8795b58-kubelet-dir\") pod \"051fe433-d2b7-4c42-8b42-ce0bd8795b58\" (UID: \"051fe433-d2b7-4c42-8b42-ce0bd8795b58\") " Oct 05 20:17:30 crc kubenswrapper[4753]: I1005 20:17:30.929733 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/051fe433-d2b7-4c42-8b42-ce0bd8795b58-kube-api-access\") pod \"051fe433-d2b7-4c42-8b42-ce0bd8795b58\" (UID: \"051fe433-d2b7-4c42-8b42-ce0bd8795b58\") " Oct 05 20:17:30 crc kubenswrapper[4753]: I1005 20:17:30.929736 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/051fe433-d2b7-4c42-8b42-ce0bd8795b58-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "051fe433-d2b7-4c42-8b42-ce0bd8795b58" (UID: "051fe433-d2b7-4c42-8b42-ce0bd8795b58"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:17:30 crc kubenswrapper[4753]: I1005 20:17:30.929968 4753 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/051fe433-d2b7-4c42-8b42-ce0bd8795b58-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 05 20:17:30 crc kubenswrapper[4753]: I1005 20:17:30.962835 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/051fe433-d2b7-4c42-8b42-ce0bd8795b58-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "051fe433-d2b7-4c42-8b42-ce0bd8795b58" (UID: "051fe433-d2b7-4c42-8b42-ce0bd8795b58"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:17:31 crc kubenswrapper[4753]: I1005 20:17:31.031695 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/051fe433-d2b7-4c42-8b42-ce0bd8795b58-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 05 20:17:31 crc kubenswrapper[4753]: I1005 20:17:31.314396 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"051fe433-d2b7-4c42-8b42-ce0bd8795b58","Type":"ContainerDied","Data":"9f02a308185054a6d7bdfd003d7b17d9c8bd03ed373efb066f892ec66525136d"} Oct 05 20:17:31 crc kubenswrapper[4753]: I1005 20:17:31.314700 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f02a308185054a6d7bdfd003d7b17d9c8bd03ed373efb066f892ec66525136d" Oct 05 20:17:31 crc kubenswrapper[4753]: I1005 20:17:31.314480 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 05 20:17:31 crc kubenswrapper[4753]: I1005 20:17:31.426617 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:31 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:31 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:31 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:31 crc kubenswrapper[4753]: I1005 20:17:31.426674 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:32 crc kubenswrapper[4753]: I1005 20:17:32.051023 4753 patch_prober.go:28] interesting 
pod/console-f9d7485db-7klvp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 05 20:17:32 crc kubenswrapper[4753]: I1005 20:17:32.051074 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7klvp" podUID="eb329af5-99e8-42d9-b79e-4c9acd09204d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 05 20:17:32 crc kubenswrapper[4753]: I1005 20:17:32.216949 4753 patch_prober.go:28] interesting pod/downloads-7954f5f757-bnpxs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Oct 05 20:17:32 crc kubenswrapper[4753]: I1005 20:17:32.216976 4753 patch_prober.go:28] interesting pod/downloads-7954f5f757-bnpxs container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Oct 05 20:17:32 crc kubenswrapper[4753]: I1005 20:17:32.217024 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bnpxs" podUID="0cae56c9-6ca6-49f3-97d2-14e8a6748315" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Oct 05 20:17:32 crc kubenswrapper[4753]: I1005 20:17:32.217057 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-bnpxs" podUID="0cae56c9-6ca6-49f3-97d2-14e8a6748315" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Oct 05 20:17:32 crc 
kubenswrapper[4753]: I1005 20:17:32.429507 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:32 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:32 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:32 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:32 crc kubenswrapper[4753]: I1005 20:17:32.429556 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:33 crc kubenswrapper[4753]: I1005 20:17:33.426708 4753 patch_prober.go:28] interesting pod/router-default-5444994796-zbdx5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 05 20:17:33 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Oct 05 20:17:33 crc kubenswrapper[4753]: [+]process-running ok Oct 05 20:17:33 crc kubenswrapper[4753]: healthz check failed Oct 05 20:17:33 crc kubenswrapper[4753]: I1005 20:17:33.426782 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zbdx5" podUID="e8855a1b-3e62-4ed3-acb0-eb0663d8df01" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 05 20:17:34 crc kubenswrapper[4753]: I1005 20:17:34.427600 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-zbdx5" Oct 05 20:17:34 crc kubenswrapper[4753]: I1005 20:17:34.430068 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ingress/router-default-5444994796-zbdx5" Oct 05 20:17:34 crc kubenswrapper[4753]: I1005 20:17:34.494284 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:17:34 crc kubenswrapper[4753]: I1005 20:17:34.494335 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:17:36 crc kubenswrapper[4753]: I1005 20:17:36.131130 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs\") pod \"network-metrics-daemon-ktspr\" (UID: \"f99b8ef3-70ed-42e4-9217-a300fcd562d9\") " pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:17:36 crc kubenswrapper[4753]: I1005 20:17:36.138971 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f99b8ef3-70ed-42e4-9217-a300fcd562d9-metrics-certs\") pod \"network-metrics-daemon-ktspr\" (UID: \"f99b8ef3-70ed-42e4-9217-a300fcd562d9\") " pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:17:36 crc kubenswrapper[4753]: I1005 20:17:36.273669 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ktspr" Oct 05 20:17:42 crc kubenswrapper[4753]: I1005 20:17:42.056241 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:42 crc kubenswrapper[4753]: I1005 20:17:42.062920 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:17:42 crc kubenswrapper[4753]: I1005 20:17:42.233527 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-bnpxs" Oct 05 20:17:43 crc kubenswrapper[4753]: I1005 20:17:43.306719 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:17:52 crc kubenswrapper[4753]: E1005 20:17:52.240303 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 05 20:17:52 crc kubenswrapper[4753]: E1005 20:17:52.240996 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6mrvv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hldsk_openshift-marketplace(befdf821-5bbf-4606-970d-ed88cf79993d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 05 20:17:52 crc kubenswrapper[4753]: E1005 20:17:52.244040 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hldsk" podUID="befdf821-5bbf-4606-970d-ed88cf79993d" Oct 05 20:17:52 crc 
kubenswrapper[4753]: E1005 20:17:52.883042 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 05 20:17:52 crc kubenswrapper[4753]: E1005 20:17:52.883295 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n92lx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-mtszk_openshift-marketplace(a5e1c696-7a58-4427-a56d-6a774fa06532): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 05 20:17:52 crc kubenswrapper[4753]: E1005 20:17:52.884446 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-mtszk" podUID="a5e1c696-7a58-4427-a56d-6a774fa06532" Oct 05 20:17:53 crc kubenswrapper[4753]: I1005 20:17:53.538925 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vvtsf" Oct 05 20:17:55 crc kubenswrapper[4753]: E1005 20:17:55.521340 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hldsk" podUID="befdf821-5bbf-4606-970d-ed88cf79993d" Oct 05 20:17:55 crc kubenswrapper[4753]: E1005 20:17:55.522480 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mtszk" podUID="a5e1c696-7a58-4427-a56d-6a774fa06532" Oct 05 20:17:55 crc kubenswrapper[4753]: E1005 20:17:55.764585 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 05 20:17:55 crc kubenswrapper[4753]: E1005 
20:17:55.764765 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qh9xd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xwh7d_openshift-marketplace(431bdbc7-61ed-4d70-9d8c-6576bc51e2d4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 05 20:17:55 crc kubenswrapper[4753]: E1005 20:17:55.766117 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xwh7d" podUID="431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" Oct 05 20:17:55 crc kubenswrapper[4753]: E1005 20:17:55.830832 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 05 20:17:55 crc kubenswrapper[4753]: E1005 20:17:55.831080 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n6lrf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmo
rProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-p4pcf_openshift-marketplace(7329a47a-c470-4d01-92c6-9addeedeb6e1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 05 20:17:55 crc kubenswrapper[4753]: E1005 20:17:55.832429 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-p4pcf" podUID="7329a47a-c470-4d01-92c6-9addeedeb6e1" Oct 05 20:17:56 crc kubenswrapper[4753]: E1005 20:17:56.460684 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 05 20:17:56 crc kubenswrapper[4753]: E1005 20:17:56.461133 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8jhdn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-xm5lr_openshift-marketplace(4c69edf5-c1fc-4392-bfb6-f52a114c4c19): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 05 20:17:56 crc kubenswrapper[4753]: E1005 20:17:56.462362 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-xm5lr" podUID="4c69edf5-c1fc-4392-bfb6-f52a114c4c19" Oct 05 20:17:56 crc 
kubenswrapper[4753]: E1005 20:17:56.503369 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 05 20:17:56 crc kubenswrapper[4753]: E1005 20:17:56.503508 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-28zrb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-7dx25_openshift-marketplace(c0bfd881-720f-48b7-acdd-7bb4b3722471): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 05 20:17:56 crc kubenswrapper[4753]: E1005 20:17:56.540765 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7dx25" podUID="c0bfd881-720f-48b7-acdd-7bb4b3722471" Oct 05 20:17:56 crc kubenswrapper[4753]: E1005 20:17:56.553652 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-p4pcf" podUID="7329a47a-c470-4d01-92c6-9addeedeb6e1" Oct 05 20:17:56 crc kubenswrapper[4753]: E1005 20:17:56.553660 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xwh7d" podUID="431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" Oct 05 20:17:56 crc kubenswrapper[4753]: E1005 20:17:56.560818 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-xm5lr" podUID="4c69edf5-c1fc-4392-bfb6-f52a114c4c19" Oct 05 20:17:56 crc kubenswrapper[4753]: I1005 20:17:56.871973 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ktspr"] Oct 05 20:17:57 crc 
kubenswrapper[4753]: I1005 20:17:57.556190 4753 generic.go:334] "Generic (PLEG): container finished" podID="846a9f6b-930c-4527-9b57-c8ca70b345d0" containerID="051f249e6c3e3695bad48c340baa44157c9f5e7457c680751043e8c090962b36" exitCode=0 Oct 05 20:17:57 crc kubenswrapper[4753]: I1005 20:17:57.556278 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vm5zp" event={"ID":"846a9f6b-930c-4527-9b57-c8ca70b345d0","Type":"ContainerDied","Data":"051f249e6c3e3695bad48c340baa44157c9f5e7457c680751043e8c090962b36"} Oct 05 20:17:57 crc kubenswrapper[4753]: I1005 20:17:57.560053 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ktspr" event={"ID":"f99b8ef3-70ed-42e4-9217-a300fcd562d9","Type":"ContainerStarted","Data":"5a440d4f8ee8e679b91293189dcaf6451413caf4b6c409ee64eeedcce4faa567"} Oct 05 20:17:57 crc kubenswrapper[4753]: I1005 20:17:57.560075 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ktspr" event={"ID":"f99b8ef3-70ed-42e4-9217-a300fcd562d9","Type":"ContainerStarted","Data":"7f69cef8bcde31937d85acfb26349e62c1fe29914947f282977b143e8a8ebd22"} Oct 05 20:17:57 crc kubenswrapper[4753]: I1005 20:17:57.563182 4753 generic.go:334] "Generic (PLEG): container finished" podID="ded39b3c-186e-4798-a5e3-eefcef9ebd41" containerID="ca5e173a18893323ab9fb65d1db5c6bee0355794985374b0b11eaed060bb66bf" exitCode=0 Oct 05 20:17:57 crc kubenswrapper[4753]: I1005 20:17:57.563235 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56tnl" event={"ID":"ded39b3c-186e-4798-a5e3-eefcef9ebd41","Type":"ContainerDied","Data":"ca5e173a18893323ab9fb65d1db5c6bee0355794985374b0b11eaed060bb66bf"} Oct 05 20:17:58 crc kubenswrapper[4753]: I1005 20:17:58.575214 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ktspr" 
event={"ID":"f99b8ef3-70ed-42e4-9217-a300fcd562d9","Type":"ContainerStarted","Data":"13a01dd88059f32142ac53ae14bae92ad15b0d47b498b53264e558a48811a8cc"} Oct 05 20:17:58 crc kubenswrapper[4753]: I1005 20:17:58.594114 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ktspr" podStartSLOduration=165.594090102 podStartE2EDuration="2m45.594090102s" podCreationTimestamp="2025-10-05 20:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:17:58.593474083 +0000 UTC m=+187.441802345" watchObservedRunningTime="2025-10-05 20:17:58.594090102 +0000 UTC m=+187.442418354" Oct 05 20:17:59 crc kubenswrapper[4753]: I1005 20:17:59.832573 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 05 20:18:01 crc kubenswrapper[4753]: I1005 20:18:01.596016 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vm5zp" event={"ID":"846a9f6b-930c-4527-9b57-c8ca70b345d0","Type":"ContainerStarted","Data":"297ec75f0e8766754f4ee23a52357d7fa22c76473434d5fa3513ea22c2cfcb1e"} Oct 05 20:18:01 crc kubenswrapper[4753]: I1005 20:18:01.598258 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56tnl" event={"ID":"ded39b3c-186e-4798-a5e3-eefcef9ebd41","Type":"ContainerStarted","Data":"669c16fa7de7657d438af283f119f6b5b44b1eb3aba266d078c34f7b14aedf58"} Oct 05 20:18:01 crc kubenswrapper[4753]: I1005 20:18:01.620978 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vm5zp" podStartSLOduration=4.04682539 podStartE2EDuration="40.620951451s" podCreationTimestamp="2025-10-05 20:17:21 +0000 UTC" firstStartedPulling="2025-10-05 20:17:24.068123356 +0000 UTC m=+152.916451598" 
lastFinishedPulling="2025-10-05 20:18:00.642249397 +0000 UTC m=+189.490577659" observedRunningTime="2025-10-05 20:18:01.618588968 +0000 UTC m=+190.466917200" watchObservedRunningTime="2025-10-05 20:18:01.620951451 +0000 UTC m=+190.469279683" Oct 05 20:18:01 crc kubenswrapper[4753]: I1005 20:18:01.664520 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-56tnl" podStartSLOduration=5.216339962 podStartE2EDuration="40.664504273s" podCreationTimestamp="2025-10-05 20:17:21 +0000 UTC" firstStartedPulling="2025-10-05 20:17:24.101008532 +0000 UTC m=+152.949336764" lastFinishedPulling="2025-10-05 20:17:59.549172813 +0000 UTC m=+188.397501075" observedRunningTime="2025-10-05 20:18:01.663280146 +0000 UTC m=+190.511608378" watchObservedRunningTime="2025-10-05 20:18:01.664504273 +0000 UTC m=+190.512832505" Oct 05 20:18:02 crc kubenswrapper[4753]: I1005 20:18:02.092340 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:18:02 crc kubenswrapper[4753]: I1005 20:18:02.092869 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:18:02 crc kubenswrapper[4753]: I1005 20:18:02.159170 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:18:02 crc kubenswrapper[4753]: I1005 20:18:02.159229 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:18:03 crc kubenswrapper[4753]: I1005 20:18:03.213014 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-vm5zp" podUID="846a9f6b-930c-4527-9b57-c8ca70b345d0" containerName="registry-server" probeResult="failure" output=< Oct 05 20:18:03 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" 
within 1s Oct 05 20:18:03 crc kubenswrapper[4753]: > Oct 05 20:18:03 crc kubenswrapper[4753]: I1005 20:18:03.216701 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-56tnl" podUID="ded39b3c-186e-4798-a5e3-eefcef9ebd41" containerName="registry-server" probeResult="failure" output=< Oct 05 20:18:03 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 20:18:03 crc kubenswrapper[4753]: > Oct 05 20:18:04 crc kubenswrapper[4753]: I1005 20:18:04.489568 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:18:04 crc kubenswrapper[4753]: I1005 20:18:04.490010 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:18:09 crc kubenswrapper[4753]: I1005 20:18:09.639921 4753 generic.go:334] "Generic (PLEG): container finished" podID="7329a47a-c470-4d01-92c6-9addeedeb6e1" containerID="74e0e6a70bb71499e35344d38e15b98160f40f414ff0b15d82654ba5d901cf90" exitCode=0 Oct 05 20:18:09 crc kubenswrapper[4753]: I1005 20:18:09.640012 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4pcf" event={"ID":"7329a47a-c470-4d01-92c6-9addeedeb6e1","Type":"ContainerDied","Data":"74e0e6a70bb71499e35344d38e15b98160f40f414ff0b15d82654ba5d901cf90"} Oct 05 20:18:09 crc kubenswrapper[4753]: I1005 20:18:09.644219 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xm5lr" 
event={"ID":"4c69edf5-c1fc-4392-bfb6-f52a114c4c19","Type":"ContainerStarted","Data":"5d89ec26f6e51c3c036fb6edb8ffdd385ae5f3f996b38e2748511e0eee8cb033"} Oct 05 20:18:10 crc kubenswrapper[4753]: I1005 20:18:10.650941 4753 generic.go:334] "Generic (PLEG): container finished" podID="befdf821-5bbf-4606-970d-ed88cf79993d" containerID="f4ad276cf1611492eeb4f394d0668466e6eb70bed0f578f84682d0508e441c2d" exitCode=0 Oct 05 20:18:10 crc kubenswrapper[4753]: I1005 20:18:10.651035 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hldsk" event={"ID":"befdf821-5bbf-4606-970d-ed88cf79993d","Type":"ContainerDied","Data":"f4ad276cf1611492eeb4f394d0668466e6eb70bed0f578f84682d0508e441c2d"} Oct 05 20:18:10 crc kubenswrapper[4753]: I1005 20:18:10.653582 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4pcf" event={"ID":"7329a47a-c470-4d01-92c6-9addeedeb6e1","Type":"ContainerStarted","Data":"1cf0adcba6ebc2cae6c6f3ce12d02988e244c8e27251f1a5e241316974df7fac"} Oct 05 20:18:10 crc kubenswrapper[4753]: I1005 20:18:10.655399 4753 generic.go:334] "Generic (PLEG): container finished" podID="431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" containerID="655d2ada972eeae271b8283a2c9a51210ec9ac61f0cc1398787b2ec6671b4a33" exitCode=0 Oct 05 20:18:10 crc kubenswrapper[4753]: I1005 20:18:10.655435 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwh7d" event={"ID":"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4","Type":"ContainerDied","Data":"655d2ada972eeae271b8283a2c9a51210ec9ac61f0cc1398787b2ec6671b4a33"} Oct 05 20:18:10 crc kubenswrapper[4753]: I1005 20:18:10.659669 4753 generic.go:334] "Generic (PLEG): container finished" podID="4c69edf5-c1fc-4392-bfb6-f52a114c4c19" containerID="5d89ec26f6e51c3c036fb6edb8ffdd385ae5f3f996b38e2748511e0eee8cb033" exitCode=0 Oct 05 20:18:10 crc kubenswrapper[4753]: I1005 20:18:10.659696 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-xm5lr" event={"ID":"4c69edf5-c1fc-4392-bfb6-f52a114c4c19","Type":"ContainerDied","Data":"5d89ec26f6e51c3c036fb6edb8ffdd385ae5f3f996b38e2748511e0eee8cb033"} Oct 05 20:18:10 crc kubenswrapper[4753]: I1005 20:18:10.702615 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p4pcf" podStartSLOduration=2.879831133 podStartE2EDuration="46.702594539s" podCreationTimestamp="2025-10-05 20:17:24 +0000 UTC" firstStartedPulling="2025-10-05 20:17:26.226771881 +0000 UTC m=+155.075100103" lastFinishedPulling="2025-10-05 20:18:10.049535277 +0000 UTC m=+198.897863509" observedRunningTime="2025-10-05 20:18:10.701577886 +0000 UTC m=+199.549906118" watchObservedRunningTime="2025-10-05 20:18:10.702594539 +0000 UTC m=+199.550922771" Oct 05 20:18:11 crc kubenswrapper[4753]: I1005 20:18:11.668389 4753 generic.go:334] "Generic (PLEG): container finished" podID="a5e1c696-7a58-4427-a56d-6a774fa06532" containerID="fcbd0f1c06aaf989c5c2140a6cb9d75ed14084c9ff38d485159c25f46934f4b6" exitCode=0 Oct 05 20:18:11 crc kubenswrapper[4753]: I1005 20:18:11.668463 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtszk" event={"ID":"a5e1c696-7a58-4427-a56d-6a774fa06532","Type":"ContainerDied","Data":"fcbd0f1c06aaf989c5c2140a6cb9d75ed14084c9ff38d485159c25f46934f4b6"} Oct 05 20:18:12 crc kubenswrapper[4753]: I1005 20:18:12.144445 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:18:12 crc kubenswrapper[4753]: I1005 20:18:12.193148 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:18:12 crc kubenswrapper[4753]: I1005 20:18:12.200640 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:18:12 
crc kubenswrapper[4753]: I1005 20:18:12.265712 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:18:12 crc kubenswrapper[4753]: I1005 20:18:12.676643 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwh7d" event={"ID":"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4","Type":"ContainerStarted","Data":"f7449c55e8f96e88395491531b7a87c218b6906ff0f7d79775475a2facd88619"} Oct 05 20:18:12 crc kubenswrapper[4753]: I1005 20:18:12.679353 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xm5lr" event={"ID":"4c69edf5-c1fc-4392-bfb6-f52a114c4c19","Type":"ContainerStarted","Data":"aa0529db90491c5d8251fb230746818308e73634d9803ed1708c70351ad98228"} Oct 05 20:18:12 crc kubenswrapper[4753]: I1005 20:18:12.681845 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hldsk" event={"ID":"befdf821-5bbf-4606-970d-ed88cf79993d","Type":"ContainerStarted","Data":"dbcd15ac257176a4d9bb910b1871ca0ec6384c2785a5f7e8aabfa2862fc40ed4"} Oct 05 20:18:12 crc kubenswrapper[4753]: I1005 20:18:12.684638 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtszk" event={"ID":"a5e1c696-7a58-4427-a56d-6a774fa06532","Type":"ContainerStarted","Data":"0d4d7e9e5c0bad606d6430a2c17474a37837220601acc76181af68c3085d51fe"} Oct 05 20:18:12 crc kubenswrapper[4753]: I1005 20:18:12.688624 4753 generic.go:334] "Generic (PLEG): container finished" podID="c0bfd881-720f-48b7-acdd-7bb4b3722471" containerID="90b78000311c5dfc217aa53af69be66582d9ef3da54e02bf8b9b468004bbeeef" exitCode=0 Oct 05 20:18:12 crc kubenswrapper[4753]: I1005 20:18:12.688693 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dx25" 
event={"ID":"c0bfd881-720f-48b7-acdd-7bb4b3722471","Type":"ContainerDied","Data":"90b78000311c5dfc217aa53af69be66582d9ef3da54e02bf8b9b468004bbeeef"} Oct 05 20:18:12 crc kubenswrapper[4753]: I1005 20:18:12.703494 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xwh7d" podStartSLOduration=3.019873989 podStartE2EDuration="48.703470383s" podCreationTimestamp="2025-10-05 20:17:24 +0000 UTC" firstStartedPulling="2025-10-05 20:17:26.226732419 +0000 UTC m=+155.075060651" lastFinishedPulling="2025-10-05 20:18:11.910328813 +0000 UTC m=+200.758657045" observedRunningTime="2025-10-05 20:18:12.701738336 +0000 UTC m=+201.550066558" watchObservedRunningTime="2025-10-05 20:18:12.703470383 +0000 UTC m=+201.551798615" Oct 05 20:18:12 crc kubenswrapper[4753]: I1005 20:18:12.752839 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mtszk" podStartSLOduration=3.591526939 podStartE2EDuration="51.752822873s" podCreationTimestamp="2025-10-05 20:17:21 +0000 UTC" firstStartedPulling="2025-10-05 20:17:24.124856565 +0000 UTC m=+152.973184797" lastFinishedPulling="2025-10-05 20:18:12.286152499 +0000 UTC m=+201.134480731" observedRunningTime="2025-10-05 20:18:12.747488476 +0000 UTC m=+201.595816708" watchObservedRunningTime="2025-10-05 20:18:12.752822873 +0000 UTC m=+201.601151095" Oct 05 20:18:12 crc kubenswrapper[4753]: I1005 20:18:12.769052 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hldsk" podStartSLOduration=2.9987328939999998 podStartE2EDuration="51.769037753s" podCreationTimestamp="2025-10-05 20:17:21 +0000 UTC" firstStartedPulling="2025-10-05 20:17:23.023606813 +0000 UTC m=+151.871935045" lastFinishedPulling="2025-10-05 20:18:11.793911672 +0000 UTC m=+200.642239904" observedRunningTime="2025-10-05 20:18:12.768767744 +0000 UTC m=+201.617095976" 
watchObservedRunningTime="2025-10-05 20:18:12.769037753 +0000 UTC m=+201.617365985" Oct 05 20:18:13 crc kubenswrapper[4753]: I1005 20:18:13.694716 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dx25" event={"ID":"c0bfd881-720f-48b7-acdd-7bb4b3722471","Type":"ContainerStarted","Data":"4f66640b4f7098b76ab1a3c6a168d9b695eecd8d4944eb31f51c5bc46089f979"} Oct 05 20:18:13 crc kubenswrapper[4753]: I1005 20:18:13.714540 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xm5lr" podStartSLOduration=4.967937183 podStartE2EDuration="50.714523637s" podCreationTimestamp="2025-10-05 20:17:23 +0000 UTC" firstStartedPulling="2025-10-05 20:17:26.150388445 +0000 UTC m=+154.998716677" lastFinishedPulling="2025-10-05 20:18:11.896974899 +0000 UTC m=+200.745303131" observedRunningTime="2025-10-05 20:18:12.79269989 +0000 UTC m=+201.641028122" watchObservedRunningTime="2025-10-05 20:18:13.714523637 +0000 UTC m=+202.562851869" Oct 05 20:18:13 crc kubenswrapper[4753]: I1005 20:18:13.716291 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7dx25" podStartSLOduration=3.854328036 podStartE2EDuration="50.716284896s" podCreationTimestamp="2025-10-05 20:17:23 +0000 UTC" firstStartedPulling="2025-10-05 20:17:26.2267471 +0000 UTC m=+155.075075322" lastFinishedPulling="2025-10-05 20:18:13.08870395 +0000 UTC m=+201.937032182" observedRunningTime="2025-10-05 20:18:13.714738194 +0000 UTC m=+202.563066426" watchObservedRunningTime="2025-10-05 20:18:13.716284896 +0000 UTC m=+202.564613128" Oct 05 20:18:13 crc kubenswrapper[4753]: I1005 20:18:13.914573 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:18:13 crc kubenswrapper[4753]: I1005 20:18:13.914717 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:18:14 crc kubenswrapper[4753]: I1005 20:18:14.322203 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:18:14 crc kubenswrapper[4753]: I1005 20:18:14.322771 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:18:14 crc kubenswrapper[4753]: I1005 20:18:14.372728 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:18:14 crc kubenswrapper[4753]: I1005 20:18:14.924991 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:18:14 crc kubenswrapper[4753]: I1005 20:18:14.925155 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:18:14 crc kubenswrapper[4753]: I1005 20:18:14.959764 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-7dx25" podUID="c0bfd881-720f-48b7-acdd-7bb4b3722471" containerName="registry-server" probeResult="failure" output=< Oct 05 20:18:14 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 20:18:14 crc kubenswrapper[4753]: > Oct 05 20:18:15 crc kubenswrapper[4753]: I1005 20:18:15.331918 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:18:15 crc kubenswrapper[4753]: I1005 20:18:15.331965 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:18:16 crc kubenswrapper[4753]: I1005 20:18:16.026075 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xwh7d" podUID="431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" 
containerName="registry-server" probeResult="failure" output=< Oct 05 20:18:16 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 20:18:16 crc kubenswrapper[4753]: > Oct 05 20:18:16 crc kubenswrapper[4753]: I1005 20:18:16.379798 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p4pcf" podUID="7329a47a-c470-4d01-92c6-9addeedeb6e1" containerName="registry-server" probeResult="failure" output=< Oct 05 20:18:16 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 20:18:16 crc kubenswrapper[4753]: > Oct 05 20:18:16 crc kubenswrapper[4753]: I1005 20:18:16.479909 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-56tnl"] Oct 05 20:18:16 crc kubenswrapper[4753]: I1005 20:18:16.480194 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-56tnl" podUID="ded39b3c-186e-4798-a5e3-eefcef9ebd41" containerName="registry-server" containerID="cri-o://669c16fa7de7657d438af283f119f6b5b44b1eb3aba266d078c34f7b14aedf58" gracePeriod=2 Oct 05 20:18:17 crc kubenswrapper[4753]: I1005 20:18:17.719975 4753 generic.go:334] "Generic (PLEG): container finished" podID="ded39b3c-186e-4798-a5e3-eefcef9ebd41" containerID="669c16fa7de7657d438af283f119f6b5b44b1eb3aba266d078c34f7b14aedf58" exitCode=0 Oct 05 20:18:17 crc kubenswrapper[4753]: I1005 20:18:17.720192 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56tnl" event={"ID":"ded39b3c-186e-4798-a5e3-eefcef9ebd41","Type":"ContainerDied","Data":"669c16fa7de7657d438af283f119f6b5b44b1eb3aba266d078c34f7b14aedf58"} Oct 05 20:18:17 crc kubenswrapper[4753]: I1005 20:18:17.720356 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56tnl" 
event={"ID":"ded39b3c-186e-4798-a5e3-eefcef9ebd41","Type":"ContainerDied","Data":"20915f74f3f675a0c984ddf6ccabab545786f1c34f0730ed31c2d6cca9f9b603"} Oct 05 20:18:17 crc kubenswrapper[4753]: I1005 20:18:17.720372 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20915f74f3f675a0c984ddf6ccabab545786f1c34f0730ed31c2d6cca9f9b603" Oct 05 20:18:17 crc kubenswrapper[4753]: I1005 20:18:17.727747 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:18:17 crc kubenswrapper[4753]: I1005 20:18:17.817914 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ded39b3c-186e-4798-a5e3-eefcef9ebd41-catalog-content\") pod \"ded39b3c-186e-4798-a5e3-eefcef9ebd41\" (UID: \"ded39b3c-186e-4798-a5e3-eefcef9ebd41\") " Oct 05 20:18:17 crc kubenswrapper[4753]: I1005 20:18:17.817951 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ded39b3c-186e-4798-a5e3-eefcef9ebd41-utilities\") pod \"ded39b3c-186e-4798-a5e3-eefcef9ebd41\" (UID: \"ded39b3c-186e-4798-a5e3-eefcef9ebd41\") " Oct 05 20:18:17 crc kubenswrapper[4753]: I1005 20:18:17.817989 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l85x\" (UniqueName: \"kubernetes.io/projected/ded39b3c-186e-4798-a5e3-eefcef9ebd41-kube-api-access-7l85x\") pod \"ded39b3c-186e-4798-a5e3-eefcef9ebd41\" (UID: \"ded39b3c-186e-4798-a5e3-eefcef9ebd41\") " Oct 05 20:18:17 crc kubenswrapper[4753]: I1005 20:18:17.819529 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ded39b3c-186e-4798-a5e3-eefcef9ebd41-utilities" (OuterVolumeSpecName: "utilities") pod "ded39b3c-186e-4798-a5e3-eefcef9ebd41" (UID: "ded39b3c-186e-4798-a5e3-eefcef9ebd41"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:18:17 crc kubenswrapper[4753]: I1005 20:18:17.823882 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded39b3c-186e-4798-a5e3-eefcef9ebd41-kube-api-access-7l85x" (OuterVolumeSpecName: "kube-api-access-7l85x") pod "ded39b3c-186e-4798-a5e3-eefcef9ebd41" (UID: "ded39b3c-186e-4798-a5e3-eefcef9ebd41"). InnerVolumeSpecName "kube-api-access-7l85x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:18:17 crc kubenswrapper[4753]: I1005 20:18:17.865479 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ded39b3c-186e-4798-a5e3-eefcef9ebd41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ded39b3c-186e-4798-a5e3-eefcef9ebd41" (UID: "ded39b3c-186e-4798-a5e3-eefcef9ebd41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:18:17 crc kubenswrapper[4753]: I1005 20:18:17.918993 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ded39b3c-186e-4798-a5e3-eefcef9ebd41-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:17 crc kubenswrapper[4753]: I1005 20:18:17.919033 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ded39b3c-186e-4798-a5e3-eefcef9ebd41-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:17 crc kubenswrapper[4753]: I1005 20:18:17.919049 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7l85x\" (UniqueName: \"kubernetes.io/projected/ded39b3c-186e-4798-a5e3-eefcef9ebd41-kube-api-access-7l85x\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:18 crc kubenswrapper[4753]: I1005 20:18:18.725301 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-56tnl" Oct 05 20:18:18 crc kubenswrapper[4753]: I1005 20:18:18.739667 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-56tnl"] Oct 05 20:18:18 crc kubenswrapper[4753]: I1005 20:18:18.745873 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-56tnl"] Oct 05 20:18:19 crc kubenswrapper[4753]: I1005 20:18:19.859275 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ded39b3c-186e-4798-a5e3-eefcef9ebd41" path="/var/lib/kubelet/pods/ded39b3c-186e-4798-a5e3-eefcef9ebd41/volumes" Oct 05 20:18:21 crc kubenswrapper[4753]: I1005 20:18:21.949357 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:18:21 crc kubenswrapper[4753]: I1005 20:18:21.949688 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:18:21 crc kubenswrapper[4753]: I1005 20:18:21.993307 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:18:22 crc kubenswrapper[4753]: I1005 20:18:22.368081 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:18:22 crc kubenswrapper[4753]: I1005 20:18:22.368167 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:18:22 crc kubenswrapper[4753]: I1005 20:18:22.402850 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:18:22 crc kubenswrapper[4753]: I1005 20:18:22.786005 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:18:22 crc kubenswrapper[4753]: I1005 20:18:22.787703 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:18:23 crc kubenswrapper[4753]: I1005 20:18:23.281268 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mtszk"] Oct 05 20:18:23 crc kubenswrapper[4753]: I1005 20:18:23.957004 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:18:23 crc kubenswrapper[4753]: I1005 20:18:23.996667 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:18:24 crc kubenswrapper[4753]: I1005 20:18:24.358741 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:18:24 crc kubenswrapper[4753]: I1005 20:18:24.759012 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mtszk" podUID="a5e1c696-7a58-4427-a56d-6a774fa06532" containerName="registry-server" containerID="cri-o://0d4d7e9e5c0bad606d6430a2c17474a37837220601acc76181af68c3085d51fe" gracePeriod=2 Oct 05 20:18:24 crc kubenswrapper[4753]: I1005 20:18:24.994080 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.044891 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.164409 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.319036 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e1c696-7a58-4427-a56d-6a774fa06532-utilities\") pod \"a5e1c696-7a58-4427-a56d-6a774fa06532\" (UID: \"a5e1c696-7a58-4427-a56d-6a774fa06532\") " Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.319196 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n92lx\" (UniqueName: \"kubernetes.io/projected/a5e1c696-7a58-4427-a56d-6a774fa06532-kube-api-access-n92lx\") pod \"a5e1c696-7a58-4427-a56d-6a774fa06532\" (UID: \"a5e1c696-7a58-4427-a56d-6a774fa06532\") " Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.320209 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e1c696-7a58-4427-a56d-6a774fa06532-utilities" (OuterVolumeSpecName: "utilities") pod "a5e1c696-7a58-4427-a56d-6a774fa06532" (UID: "a5e1c696-7a58-4427-a56d-6a774fa06532"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.320499 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e1c696-7a58-4427-a56d-6a774fa06532-catalog-content\") pod \"a5e1c696-7a58-4427-a56d-6a774fa06532\" (UID: \"a5e1c696-7a58-4427-a56d-6a774fa06532\") " Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.320763 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e1c696-7a58-4427-a56d-6a774fa06532-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.339366 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e1c696-7a58-4427-a56d-6a774fa06532-kube-api-access-n92lx" (OuterVolumeSpecName: "kube-api-access-n92lx") pod "a5e1c696-7a58-4427-a56d-6a774fa06532" (UID: "a5e1c696-7a58-4427-a56d-6a774fa06532"). InnerVolumeSpecName "kube-api-access-n92lx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.366259 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e1c696-7a58-4427-a56d-6a774fa06532-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5e1c696-7a58-4427-a56d-6a774fa06532" (UID: "a5e1c696-7a58-4427-a56d-6a774fa06532"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.379575 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.422273 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n92lx\" (UniqueName: \"kubernetes.io/projected/a5e1c696-7a58-4427-a56d-6a774fa06532-kube-api-access-n92lx\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.422325 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e1c696-7a58-4427-a56d-6a774fa06532-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.439486 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.767695 4753 generic.go:334] "Generic (PLEG): container finished" podID="a5e1c696-7a58-4427-a56d-6a774fa06532" containerID="0d4d7e9e5c0bad606d6430a2c17474a37837220601acc76181af68c3085d51fe" exitCode=0 Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.767749 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mtszk" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.767771 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtszk" event={"ID":"a5e1c696-7a58-4427-a56d-6a774fa06532","Type":"ContainerDied","Data":"0d4d7e9e5c0bad606d6430a2c17474a37837220601acc76181af68c3085d51fe"} Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.768098 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtszk" event={"ID":"a5e1c696-7a58-4427-a56d-6a774fa06532","Type":"ContainerDied","Data":"5730dcc1b1a0394cb740404fa6f00225ed1eaa5374e461477718e6db0aec61dc"} Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.768121 4753 scope.go:117] "RemoveContainer" containerID="0d4d7e9e5c0bad606d6430a2c17474a37837220601acc76181af68c3085d51fe" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.806446 4753 scope.go:117] "RemoveContainer" containerID="fcbd0f1c06aaf989c5c2140a6cb9d75ed14084c9ff38d485159c25f46934f4b6" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.809943 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mtszk"] Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.815999 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mtszk"] Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.824581 4753 scope.go:117] "RemoveContainer" containerID="8c41c5bb8f4bddf25f220b953e565d5e1954e77e01ff6ac7aef105ce5df845cd" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.850966 4753 scope.go:117] "RemoveContainer" containerID="0d4d7e9e5c0bad606d6430a2c17474a37837220601acc76181af68c3085d51fe" Oct 05 20:18:25 crc kubenswrapper[4753]: E1005 20:18:25.851581 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0d4d7e9e5c0bad606d6430a2c17474a37837220601acc76181af68c3085d51fe\": container with ID starting with 0d4d7e9e5c0bad606d6430a2c17474a37837220601acc76181af68c3085d51fe not found: ID does not exist" containerID="0d4d7e9e5c0bad606d6430a2c17474a37837220601acc76181af68c3085d51fe" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.851652 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d4d7e9e5c0bad606d6430a2c17474a37837220601acc76181af68c3085d51fe"} err="failed to get container status \"0d4d7e9e5c0bad606d6430a2c17474a37837220601acc76181af68c3085d51fe\": rpc error: code = NotFound desc = could not find container \"0d4d7e9e5c0bad606d6430a2c17474a37837220601acc76181af68c3085d51fe\": container with ID starting with 0d4d7e9e5c0bad606d6430a2c17474a37837220601acc76181af68c3085d51fe not found: ID does not exist" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.851746 4753 scope.go:117] "RemoveContainer" containerID="fcbd0f1c06aaf989c5c2140a6cb9d75ed14084c9ff38d485159c25f46934f4b6" Oct 05 20:18:25 crc kubenswrapper[4753]: E1005 20:18:25.852643 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcbd0f1c06aaf989c5c2140a6cb9d75ed14084c9ff38d485159c25f46934f4b6\": container with ID starting with fcbd0f1c06aaf989c5c2140a6cb9d75ed14084c9ff38d485159c25f46934f4b6 not found: ID does not exist" containerID="fcbd0f1c06aaf989c5c2140a6cb9d75ed14084c9ff38d485159c25f46934f4b6" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.852689 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcbd0f1c06aaf989c5c2140a6cb9d75ed14084c9ff38d485159c25f46934f4b6"} err="failed to get container status \"fcbd0f1c06aaf989c5c2140a6cb9d75ed14084c9ff38d485159c25f46934f4b6\": rpc error: code = NotFound desc = could not find container \"fcbd0f1c06aaf989c5c2140a6cb9d75ed14084c9ff38d485159c25f46934f4b6\": container with ID 
starting with fcbd0f1c06aaf989c5c2140a6cb9d75ed14084c9ff38d485159c25f46934f4b6 not found: ID does not exist" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.852710 4753 scope.go:117] "RemoveContainer" containerID="8c41c5bb8f4bddf25f220b953e565d5e1954e77e01ff6ac7aef105ce5df845cd" Oct 05 20:18:25 crc kubenswrapper[4753]: E1005 20:18:25.853423 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c41c5bb8f4bddf25f220b953e565d5e1954e77e01ff6ac7aef105ce5df845cd\": container with ID starting with 8c41c5bb8f4bddf25f220b953e565d5e1954e77e01ff6ac7aef105ce5df845cd not found: ID does not exist" containerID="8c41c5bb8f4bddf25f220b953e565d5e1954e77e01ff6ac7aef105ce5df845cd" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.853461 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c41c5bb8f4bddf25f220b953e565d5e1954e77e01ff6ac7aef105ce5df845cd"} err="failed to get container status \"8c41c5bb8f4bddf25f220b953e565d5e1954e77e01ff6ac7aef105ce5df845cd\": rpc error: code = NotFound desc = could not find container \"8c41c5bb8f4bddf25f220b953e565d5e1954e77e01ff6ac7aef105ce5df845cd\": container with ID starting with 8c41c5bb8f4bddf25f220b953e565d5e1954e77e01ff6ac7aef105ce5df845cd not found: ID does not exist" Oct 05 20:18:25 crc kubenswrapper[4753]: I1005 20:18:25.863046 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e1c696-7a58-4427-a56d-6a774fa06532" path="/var/lib/kubelet/pods/a5e1c696-7a58-4427-a56d-6a774fa06532/volumes" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.281880 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xm5lr"] Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.282265 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xm5lr" 
podUID="4c69edf5-c1fc-4392-bfb6-f52a114c4c19" containerName="registry-server" containerID="cri-o://aa0529db90491c5d8251fb230746818308e73634d9803ed1708c70351ad98228" gracePeriod=2 Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.601576 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.739642 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-utilities\") pod \"4c69edf5-c1fc-4392-bfb6-f52a114c4c19\" (UID: \"4c69edf5-c1fc-4392-bfb6-f52a114c4c19\") " Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.739741 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-catalog-content\") pod \"4c69edf5-c1fc-4392-bfb6-f52a114c4c19\" (UID: \"4c69edf5-c1fc-4392-bfb6-f52a114c4c19\") " Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.739786 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jhdn\" (UniqueName: \"kubernetes.io/projected/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-kube-api-access-8jhdn\") pod \"4c69edf5-c1fc-4392-bfb6-f52a114c4c19\" (UID: \"4c69edf5-c1fc-4392-bfb6-f52a114c4c19\") " Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.740835 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-utilities" (OuterVolumeSpecName: "utilities") pod "4c69edf5-c1fc-4392-bfb6-f52a114c4c19" (UID: "4c69edf5-c1fc-4392-bfb6-f52a114c4c19"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.744288 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-kube-api-access-8jhdn" (OuterVolumeSpecName: "kube-api-access-8jhdn") pod "4c69edf5-c1fc-4392-bfb6-f52a114c4c19" (UID: "4c69edf5-c1fc-4392-bfb6-f52a114c4c19"). InnerVolumeSpecName "kube-api-access-8jhdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.753041 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c69edf5-c1fc-4392-bfb6-f52a114c4c19" (UID: "4c69edf5-c1fc-4392-bfb6-f52a114c4c19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.774561 4753 generic.go:334] "Generic (PLEG): container finished" podID="4c69edf5-c1fc-4392-bfb6-f52a114c4c19" containerID="aa0529db90491c5d8251fb230746818308e73634d9803ed1708c70351ad98228" exitCode=0 Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.774596 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xm5lr" event={"ID":"4c69edf5-c1fc-4392-bfb6-f52a114c4c19","Type":"ContainerDied","Data":"aa0529db90491c5d8251fb230746818308e73634d9803ed1708c70351ad98228"} Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.774618 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xm5lr" event={"ID":"4c69edf5-c1fc-4392-bfb6-f52a114c4c19","Type":"ContainerDied","Data":"e1c169d5d583b10fc701772466b1dc306963baba269084b91138d864d8060790"} Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.774634 4753 scope.go:117] "RemoveContainer" 
containerID="aa0529db90491c5d8251fb230746818308e73634d9803ed1708c70351ad98228" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.774712 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xm5lr" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.795236 4753 scope.go:117] "RemoveContainer" containerID="5d89ec26f6e51c3c036fb6edb8ffdd385ae5f3f996b38e2748511e0eee8cb033" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.808303 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xm5lr"] Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.811256 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xm5lr"] Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.831870 4753 scope.go:117] "RemoveContainer" containerID="1f42c035479a7261e2815c3f19cee3e5b35d6f24911688232728da15dd0375dd" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.840827 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.840881 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jhdn\" (UniqueName: \"kubernetes.io/projected/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-kube-api-access-8jhdn\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.840893 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c69edf5-c1fc-4392-bfb6-f52a114c4c19-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.841630 4753 scope.go:117] "RemoveContainer" containerID="aa0529db90491c5d8251fb230746818308e73634d9803ed1708c70351ad98228" Oct 05 20:18:26 crc 
kubenswrapper[4753]: E1005 20:18:26.841920 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa0529db90491c5d8251fb230746818308e73634d9803ed1708c70351ad98228\": container with ID starting with aa0529db90491c5d8251fb230746818308e73634d9803ed1708c70351ad98228 not found: ID does not exist" containerID="aa0529db90491c5d8251fb230746818308e73634d9803ed1708c70351ad98228" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.841949 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0529db90491c5d8251fb230746818308e73634d9803ed1708c70351ad98228"} err="failed to get container status \"aa0529db90491c5d8251fb230746818308e73634d9803ed1708c70351ad98228\": rpc error: code = NotFound desc = could not find container \"aa0529db90491c5d8251fb230746818308e73634d9803ed1708c70351ad98228\": container with ID starting with aa0529db90491c5d8251fb230746818308e73634d9803ed1708c70351ad98228 not found: ID does not exist" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.841969 4753 scope.go:117] "RemoveContainer" containerID="5d89ec26f6e51c3c036fb6edb8ffdd385ae5f3f996b38e2748511e0eee8cb033" Oct 05 20:18:26 crc kubenswrapper[4753]: E1005 20:18:26.842283 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d89ec26f6e51c3c036fb6edb8ffdd385ae5f3f996b38e2748511e0eee8cb033\": container with ID starting with 5d89ec26f6e51c3c036fb6edb8ffdd385ae5f3f996b38e2748511e0eee8cb033 not found: ID does not exist" containerID="5d89ec26f6e51c3c036fb6edb8ffdd385ae5f3f996b38e2748511e0eee8cb033" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.842311 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d89ec26f6e51c3c036fb6edb8ffdd385ae5f3f996b38e2748511e0eee8cb033"} err="failed to get container status 
\"5d89ec26f6e51c3c036fb6edb8ffdd385ae5f3f996b38e2748511e0eee8cb033\": rpc error: code = NotFound desc = could not find container \"5d89ec26f6e51c3c036fb6edb8ffdd385ae5f3f996b38e2748511e0eee8cb033\": container with ID starting with 5d89ec26f6e51c3c036fb6edb8ffdd385ae5f3f996b38e2748511e0eee8cb033 not found: ID does not exist" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.842332 4753 scope.go:117] "RemoveContainer" containerID="1f42c035479a7261e2815c3f19cee3e5b35d6f24911688232728da15dd0375dd" Oct 05 20:18:26 crc kubenswrapper[4753]: E1005 20:18:26.842571 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f42c035479a7261e2815c3f19cee3e5b35d6f24911688232728da15dd0375dd\": container with ID starting with 1f42c035479a7261e2815c3f19cee3e5b35d6f24911688232728da15dd0375dd not found: ID does not exist" containerID="1f42c035479a7261e2815c3f19cee3e5b35d6f24911688232728da15dd0375dd" Oct 05 20:18:26 crc kubenswrapper[4753]: I1005 20:18:26.842594 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f42c035479a7261e2815c3f19cee3e5b35d6f24911688232728da15dd0375dd"} err="failed to get container status \"1f42c035479a7261e2815c3f19cee3e5b35d6f24911688232728da15dd0375dd\": rpc error: code = NotFound desc = could not find container \"1f42c035479a7261e2815c3f19cee3e5b35d6f24911688232728da15dd0375dd\": container with ID starting with 1f42c035479a7261e2815c3f19cee3e5b35d6f24911688232728da15dd0375dd not found: ID does not exist" Oct 05 20:18:27 crc kubenswrapper[4753]: I1005 20:18:27.857962 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c69edf5-c1fc-4392-bfb6-f52a114c4c19" path="/var/lib/kubelet/pods/4c69edf5-c1fc-4392-bfb6-f52a114c4c19/volumes" Oct 05 20:18:28 crc kubenswrapper[4753]: I1005 20:18:28.682173 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p4pcf"] Oct 05 20:18:28 
crc kubenswrapper[4753]: I1005 20:18:28.682529 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p4pcf" podUID="7329a47a-c470-4d01-92c6-9addeedeb6e1" containerName="registry-server" containerID="cri-o://1cf0adcba6ebc2cae6c6f3ce12d02988e244c8e27251f1a5e241316974df7fac" gracePeriod=2 Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.024153 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.196831 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7329a47a-c470-4d01-92c6-9addeedeb6e1-utilities\") pod \"7329a47a-c470-4d01-92c6-9addeedeb6e1\" (UID: \"7329a47a-c470-4d01-92c6-9addeedeb6e1\") " Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.197020 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6lrf\" (UniqueName: \"kubernetes.io/projected/7329a47a-c470-4d01-92c6-9addeedeb6e1-kube-api-access-n6lrf\") pod \"7329a47a-c470-4d01-92c6-9addeedeb6e1\" (UID: \"7329a47a-c470-4d01-92c6-9addeedeb6e1\") " Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.197095 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7329a47a-c470-4d01-92c6-9addeedeb6e1-catalog-content\") pod \"7329a47a-c470-4d01-92c6-9addeedeb6e1\" (UID: \"7329a47a-c470-4d01-92c6-9addeedeb6e1\") " Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.201440 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7329a47a-c470-4d01-92c6-9addeedeb6e1-kube-api-access-n6lrf" (OuterVolumeSpecName: "kube-api-access-n6lrf") pod "7329a47a-c470-4d01-92c6-9addeedeb6e1" (UID: "7329a47a-c470-4d01-92c6-9addeedeb6e1"). 
InnerVolumeSpecName "kube-api-access-n6lrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.204116 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7329a47a-c470-4d01-92c6-9addeedeb6e1-utilities" (OuterVolumeSpecName: "utilities") pod "7329a47a-c470-4d01-92c6-9addeedeb6e1" (UID: "7329a47a-c470-4d01-92c6-9addeedeb6e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.278796 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7329a47a-c470-4d01-92c6-9addeedeb6e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7329a47a-c470-4d01-92c6-9addeedeb6e1" (UID: "7329a47a-c470-4d01-92c6-9addeedeb6e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.298810 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6lrf\" (UniqueName: \"kubernetes.io/projected/7329a47a-c470-4d01-92c6-9addeedeb6e1-kube-api-access-n6lrf\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.298846 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7329a47a-c470-4d01-92c6-9addeedeb6e1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.298860 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7329a47a-c470-4d01-92c6-9addeedeb6e1-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.793590 4753 generic.go:334] "Generic (PLEG): container finished" podID="7329a47a-c470-4d01-92c6-9addeedeb6e1" 
containerID="1cf0adcba6ebc2cae6c6f3ce12d02988e244c8e27251f1a5e241316974df7fac" exitCode=0 Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.793662 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4pcf" event={"ID":"7329a47a-c470-4d01-92c6-9addeedeb6e1","Type":"ContainerDied","Data":"1cf0adcba6ebc2cae6c6f3ce12d02988e244c8e27251f1a5e241316974df7fac"} Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.793686 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4pcf" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.794592 4753 scope.go:117] "RemoveContainer" containerID="1cf0adcba6ebc2cae6c6f3ce12d02988e244c8e27251f1a5e241316974df7fac" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.794492 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4pcf" event={"ID":"7329a47a-c470-4d01-92c6-9addeedeb6e1","Type":"ContainerDied","Data":"c6910f0c3cf360bd92c10b9bd640ec3de7251df87d6fcb4eda6e8012c57f3cf3"} Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.815381 4753 scope.go:117] "RemoveContainer" containerID="74e0e6a70bb71499e35344d38e15b98160f40f414ff0b15d82654ba5d901cf90" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.826698 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p4pcf"] Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.835787 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p4pcf"] Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.837849 4753 scope.go:117] "RemoveContainer" containerID="7cc46185e644bf4f2ecde44beb89545766161fafd57e9d5f6ed19ed96682270d" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.856842 4753 scope.go:117] "RemoveContainer" containerID="1cf0adcba6ebc2cae6c6f3ce12d02988e244c8e27251f1a5e241316974df7fac" Oct 05 20:18:29 crc 
kubenswrapper[4753]: E1005 20:18:29.857578 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cf0adcba6ebc2cae6c6f3ce12d02988e244c8e27251f1a5e241316974df7fac\": container with ID starting with 1cf0adcba6ebc2cae6c6f3ce12d02988e244c8e27251f1a5e241316974df7fac not found: ID does not exist" containerID="1cf0adcba6ebc2cae6c6f3ce12d02988e244c8e27251f1a5e241316974df7fac" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.857614 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf0adcba6ebc2cae6c6f3ce12d02988e244c8e27251f1a5e241316974df7fac"} err="failed to get container status \"1cf0adcba6ebc2cae6c6f3ce12d02988e244c8e27251f1a5e241316974df7fac\": rpc error: code = NotFound desc = could not find container \"1cf0adcba6ebc2cae6c6f3ce12d02988e244c8e27251f1a5e241316974df7fac\": container with ID starting with 1cf0adcba6ebc2cae6c6f3ce12d02988e244c8e27251f1a5e241316974df7fac not found: ID does not exist" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.857638 4753 scope.go:117] "RemoveContainer" containerID="74e0e6a70bb71499e35344d38e15b98160f40f414ff0b15d82654ba5d901cf90" Oct 05 20:18:29 crc kubenswrapper[4753]: E1005 20:18:29.858246 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74e0e6a70bb71499e35344d38e15b98160f40f414ff0b15d82654ba5d901cf90\": container with ID starting with 74e0e6a70bb71499e35344d38e15b98160f40f414ff0b15d82654ba5d901cf90 not found: ID does not exist" containerID="74e0e6a70bb71499e35344d38e15b98160f40f414ff0b15d82654ba5d901cf90" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.858312 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e0e6a70bb71499e35344d38e15b98160f40f414ff0b15d82654ba5d901cf90"} err="failed to get container status 
\"74e0e6a70bb71499e35344d38e15b98160f40f414ff0b15d82654ba5d901cf90\": rpc error: code = NotFound desc = could not find container \"74e0e6a70bb71499e35344d38e15b98160f40f414ff0b15d82654ba5d901cf90\": container with ID starting with 74e0e6a70bb71499e35344d38e15b98160f40f414ff0b15d82654ba5d901cf90 not found: ID does not exist" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.858362 4753 scope.go:117] "RemoveContainer" containerID="7cc46185e644bf4f2ecde44beb89545766161fafd57e9d5f6ed19ed96682270d" Oct 05 20:18:29 crc kubenswrapper[4753]: E1005 20:18:29.858926 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc46185e644bf4f2ecde44beb89545766161fafd57e9d5f6ed19ed96682270d\": container with ID starting with 7cc46185e644bf4f2ecde44beb89545766161fafd57e9d5f6ed19ed96682270d not found: ID does not exist" containerID="7cc46185e644bf4f2ecde44beb89545766161fafd57e9d5f6ed19ed96682270d" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.858967 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc46185e644bf4f2ecde44beb89545766161fafd57e9d5f6ed19ed96682270d"} err="failed to get container status \"7cc46185e644bf4f2ecde44beb89545766161fafd57e9d5f6ed19ed96682270d\": rpc error: code = NotFound desc = could not find container \"7cc46185e644bf4f2ecde44beb89545766161fafd57e9d5f6ed19ed96682270d\": container with ID starting with 7cc46185e644bf4f2ecde44beb89545766161fafd57e9d5f6ed19ed96682270d not found: ID does not exist" Oct 05 20:18:29 crc kubenswrapper[4753]: I1005 20:18:29.865298 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7329a47a-c470-4d01-92c6-9addeedeb6e1" path="/var/lib/kubelet/pods/7329a47a-c470-4d01-92c6-9addeedeb6e1/volumes" Oct 05 20:18:31 crc kubenswrapper[4753]: I1005 20:18:31.644201 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tpd8r"] Oct 
05 20:18:34 crc kubenswrapper[4753]: I1005 20:18:34.489982 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:18:34 crc kubenswrapper[4753]: I1005 20:18:34.490344 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:18:34 crc kubenswrapper[4753]: I1005 20:18:34.490406 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:18:34 crc kubenswrapper[4753]: I1005 20:18:34.491304 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f4e79e247970335f89b8f52baa9ff77046d6b6268b27ced5897daf482809e4ef"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 20:18:34 crc kubenswrapper[4753]: I1005 20:18:34.491405 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" containerID="cri-o://f4e79e247970335f89b8f52baa9ff77046d6b6268b27ced5897daf482809e4ef" gracePeriod=600 Oct 05 20:18:34 crc kubenswrapper[4753]: I1005 20:18:34.822704 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" 
containerID="f4e79e247970335f89b8f52baa9ff77046d6b6268b27ced5897daf482809e4ef" exitCode=0 Oct 05 20:18:34 crc kubenswrapper[4753]: I1005 20:18:34.822978 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"f4e79e247970335f89b8f52baa9ff77046d6b6268b27ced5897daf482809e4ef"} Oct 05 20:18:34 crc kubenswrapper[4753]: I1005 20:18:34.823002 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"d036ff9dbc389fc79b9fd8e46b1fb2f4c9ac8b14987017b53068ab8b2e6b56cb"} Oct 05 20:18:56 crc kubenswrapper[4753]: I1005 20:18:56.689689 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" podUID="a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" containerName="oauth-openshift" containerID="cri-o://77d0d6b1bf87cd470462bd298023aa406ffc4be5c01b9aa6c67a5b9228179b40" gracePeriod=15 Oct 05 20:18:56 crc kubenswrapper[4753]: I1005 20:18:56.979712 4753 generic.go:334] "Generic (PLEG): container finished" podID="a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" containerID="77d0d6b1bf87cd470462bd298023aa406ffc4be5c01b9aa6c67a5b9228179b40" exitCode=0 Oct 05 20:18:56 crc kubenswrapper[4753]: I1005 20:18:56.980369 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" event={"ID":"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2","Type":"ContainerDied","Data":"77d0d6b1bf87cd470462bd298023aa406ffc4be5c01b9aa6c67a5b9228179b40"} Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.153029 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.170123 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-idp-0-file-data\") pod \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.170182 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-error\") pod \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.170200 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-session\") pod \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.170226 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-audit-policies\") pod \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.170243 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-login\") pod \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " Oct 05 20:18:57 crc 
kubenswrapper[4753]: I1005 20:18:57.170271 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65qg2\" (UniqueName: \"kubernetes.io/projected/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-kube-api-access-65qg2\") pod \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.170298 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-router-certs\") pod \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.170320 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-audit-dir\") pod \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.170340 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-serving-cert\") pod \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.170362 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-ocp-branding-template\") pod \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.170380 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-provider-selection\") pod \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.170398 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-service-ca\") pod \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.170417 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-trusted-ca-bundle\") pod \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.170440 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-cliconfig\") pod \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\" (UID: \"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2\") " Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.171434 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" (UID: "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.171814 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" (UID: "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.173066 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" (UID: "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.174036 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" (UID: "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.176242 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" (UID: "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.208502 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-95988dd86-wqvmv"] Oct 05 20:18:57 crc kubenswrapper[4753]: E1005 20:18:57.210497 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded39b3c-186e-4798-a5e3-eefcef9ebd41" containerName="extract-utilities" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.210534 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded39b3c-186e-4798-a5e3-eefcef9ebd41" containerName="extract-utilities" Oct 05 20:18:57 crc kubenswrapper[4753]: E1005 20:18:57.210553 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="051fe433-d2b7-4c42-8b42-ce0bd8795b58" containerName="pruner" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.210564 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="051fe433-d2b7-4c42-8b42-ce0bd8795b58" containerName="pruner" Oct 05 20:18:57 crc kubenswrapper[4753]: E1005 20:18:57.210575 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded39b3c-186e-4798-a5e3-eefcef9ebd41" containerName="registry-server" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.210585 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded39b3c-186e-4798-a5e3-eefcef9ebd41" containerName="registry-server" Oct 05 20:18:57 crc kubenswrapper[4753]: E1005 20:18:57.210602 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded39b3c-186e-4798-a5e3-eefcef9ebd41" containerName="extract-content" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.210611 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded39b3c-186e-4798-a5e3-eefcef9ebd41" containerName="extract-content" Oct 05 20:18:57 crc kubenswrapper[4753]: E1005 20:18:57.210622 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e1c696-7a58-4427-a56d-6a774fa06532" 
containerName="extract-utilities" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.210632 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e1c696-7a58-4427-a56d-6a774fa06532" containerName="extract-utilities" Oct 05 20:18:57 crc kubenswrapper[4753]: E1005 20:18:57.210653 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7329a47a-c470-4d01-92c6-9addeedeb6e1" containerName="registry-server" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.210664 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7329a47a-c470-4d01-92c6-9addeedeb6e1" containerName="registry-server" Oct 05 20:18:57 crc kubenswrapper[4753]: E1005 20:18:57.210679 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e1c696-7a58-4427-a56d-6a774fa06532" containerName="registry-server" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.210692 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e1c696-7a58-4427-a56d-6a774fa06532" containerName="registry-server" Oct 05 20:18:57 crc kubenswrapper[4753]: E1005 20:18:57.210703 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7329a47a-c470-4d01-92c6-9addeedeb6e1" containerName="extract-content" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.210731 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7329a47a-c470-4d01-92c6-9addeedeb6e1" containerName="extract-content" Oct 05 20:18:57 crc kubenswrapper[4753]: E1005 20:18:57.210744 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e1c696-7a58-4427-a56d-6a774fa06532" containerName="extract-content" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.210753 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e1c696-7a58-4427-a56d-6a774fa06532" containerName="extract-content" Oct 05 20:18:57 crc kubenswrapper[4753]: E1005 20:18:57.210766 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" 
containerName="oauth-openshift" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.210776 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" containerName="oauth-openshift" Oct 05 20:18:57 crc kubenswrapper[4753]: E1005 20:18:57.210793 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c69edf5-c1fc-4392-bfb6-f52a114c4c19" containerName="extract-utilities" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.210803 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c69edf5-c1fc-4392-bfb6-f52a114c4c19" containerName="extract-utilities" Oct 05 20:18:57 crc kubenswrapper[4753]: E1005 20:18:57.210821 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7329a47a-c470-4d01-92c6-9addeedeb6e1" containerName="extract-utilities" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.210830 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7329a47a-c470-4d01-92c6-9addeedeb6e1" containerName="extract-utilities" Oct 05 20:18:57 crc kubenswrapper[4753]: E1005 20:18:57.210846 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c69edf5-c1fc-4392-bfb6-f52a114c4c19" containerName="registry-server" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.210858 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c69edf5-c1fc-4392-bfb6-f52a114c4c19" containerName="registry-server" Oct 05 20:18:57 crc kubenswrapper[4753]: E1005 20:18:57.210879 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c69edf5-c1fc-4392-bfb6-f52a114c4c19" containerName="extract-content" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.210893 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c69edf5-c1fc-4392-bfb6-f52a114c4c19" containerName="extract-content" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.211019 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" 
containerName="oauth-openshift" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.211033 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7329a47a-c470-4d01-92c6-9addeedeb6e1" containerName="registry-server" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.211049 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="051fe433-d2b7-4c42-8b42-ce0bd8795b58" containerName="pruner" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.211063 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c69edf5-c1fc-4392-bfb6-f52a114c4c19" containerName="registry-server" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.211077 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded39b3c-186e-4798-a5e3-eefcef9ebd41" containerName="registry-server" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.211091 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e1c696-7a58-4427-a56d-6a774fa06532" containerName="registry-server" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.214537 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" (UID: "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.214577 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" (UID: "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.214729 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.220243 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-kube-api-access-65qg2" (OuterVolumeSpecName: "kube-api-access-65qg2") pod "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" (UID: "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2"). InnerVolumeSpecName "kube-api-access-65qg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.220450 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" (UID: "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.224423 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" (UID: "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.225772 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" (UID: "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.225933 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" (UID: "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.227329 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" (UID: "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.228990 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" (UID: "a2755894-dd6a-4e32-9ed7-a04e6e6f92f2"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.234384 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-95988dd86-wqvmv"] Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.272110 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.272163 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.272177 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.272187 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.272202 4753 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.272211 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 05 
20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.272222 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65qg2\" (UniqueName: \"kubernetes.io/projected/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-kube-api-access-65qg2\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.272233 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.272246 4753 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.272259 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.272271 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.272287 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.272302 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.272313 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.373167 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/54931658-1d67-458c-89b8-dc5ff2500ae7-audit-dir\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.373220 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.373241 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.373266 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.373754 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/54931658-1d67-458c-89b8-dc5ff2500ae7-audit-policies\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.373839 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-session\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.373864 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.373894 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-user-template-error\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: 
\"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.373913 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-router-certs\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.373966 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.374023 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-user-template-login\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.374042 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47ldr\" (UniqueName: \"kubernetes.io/projected/54931658-1d67-458c-89b8-dc5ff2500ae7-kube-api-access-47ldr\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.374061 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-service-ca\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.374122 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.475830 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.477237 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.477405 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/54931658-1d67-458c-89b8-dc5ff2500ae7-audit-policies\") pod 
\"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.477554 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-session\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.477671 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.477798 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-user-template-error\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.477913 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-router-certs\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.478034 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.478098 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/54931658-1d67-458c-89b8-dc5ff2500ae7-audit-policies\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.478279 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-user-template-login\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.478391 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47ldr\" (UniqueName: \"kubernetes.io/projected/54931658-1d67-458c-89b8-dc5ff2500ae7-kube-api-access-47ldr\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.478497 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-service-ca\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: 
\"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.478607 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.478722 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/54931658-1d67-458c-89b8-dc5ff2500ae7-audit-dir\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.478920 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.479006 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.479227 4753 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/54931658-1d67-458c-89b8-dc5ff2500ae7-audit-dir\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.479927 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-service-ca\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.478817 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.482655 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-session\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.482695 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-user-template-login\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " 
pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.482913 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.482992 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.483458 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.483932 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.483932 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-user-template-error\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.486271 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/54931658-1d67-458c-89b8-dc5ff2500ae7-v4-0-config-system-router-certs\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.497130 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47ldr\" (UniqueName: \"kubernetes.io/projected/54931658-1d67-458c-89b8-dc5ff2500ae7-kube-api-access-47ldr\") pod \"oauth-openshift-95988dd86-wqvmv\" (UID: \"54931658-1d67-458c-89b8-dc5ff2500ae7\") " pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.557960 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.846777 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-95988dd86-wqvmv"] Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.989545 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" event={"ID":"a2755894-dd6a-4e32-9ed7-a04e6e6f92f2","Type":"ContainerDied","Data":"30ff1ada779e08e24d683997af2522163b5541d6439bf424df6ef278626ec522"} Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.989612 4753 scope.go:117] "RemoveContainer" containerID="77d0d6b1bf87cd470462bd298023aa406ffc4be5c01b9aa6c67a5b9228179b40" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.989648 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tpd8r" Oct 05 20:18:57 crc kubenswrapper[4753]: I1005 20:18:57.991241 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" event={"ID":"54931658-1d67-458c-89b8-dc5ff2500ae7","Type":"ContainerStarted","Data":"92e78293dcb73bec6d242714e8be8716d6a3c1224bf3367df008b99e6742cbb8"} Oct 05 20:18:58 crc kubenswrapper[4753]: I1005 20:18:58.015881 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tpd8r"] Oct 05 20:18:58 crc kubenswrapper[4753]: I1005 20:18:58.018693 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tpd8r"] Oct 05 20:18:59 crc kubenswrapper[4753]: I1005 20:18:59.002938 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" 
event={"ID":"54931658-1d67-458c-89b8-dc5ff2500ae7","Type":"ContainerStarted","Data":"a424c020c87d6a0a5a6f921461f32aab77baf16ae8c20dc5f771787496f27220"} Oct 05 20:18:59 crc kubenswrapper[4753]: I1005 20:18:59.003411 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:59 crc kubenswrapper[4753]: I1005 20:18:59.010724 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" Oct 05 20:18:59 crc kubenswrapper[4753]: I1005 20:18:59.039837 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-95988dd86-wqvmv" podStartSLOduration=28.039800976 podStartE2EDuration="28.039800976s" podCreationTimestamp="2025-10-05 20:18:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:18:59.03779194 +0000 UTC m=+247.886120262" watchObservedRunningTime="2025-10-05 20:18:59.039800976 +0000 UTC m=+247.888129248" Oct 05 20:18:59 crc kubenswrapper[4753]: I1005 20:18:59.889527 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2755894-dd6a-4e32-9ed7-a04e6e6f92f2" path="/var/lib/kubelet/pods/a2755894-dd6a-4e32-9ed7-a04e6e6f92f2/volumes" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.109322 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vm5zp"] Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.110381 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vm5zp" podUID="846a9f6b-930c-4527-9b57-c8ca70b345d0" containerName="registry-server" containerID="cri-o://297ec75f0e8766754f4ee23a52357d7fa22c76473434d5fa3513ea22c2cfcb1e" gracePeriod=30 Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.123693 4753 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hldsk"] Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.123931 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hldsk" podUID="befdf821-5bbf-4606-970d-ed88cf79993d" containerName="registry-server" containerID="cri-o://dbcd15ac257176a4d9bb910b1871ca0ec6384c2785a5f7e8aabfa2862fc40ed4" gracePeriod=30 Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.130533 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gx875"] Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.130731 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-gx875" podUID="834546fa-9104-4394-a674-e0350de62fb1" containerName="marketplace-operator" containerID="cri-o://27f82debeab5b09bd56f19ef600a5745d895a5a0817a061216b7ac8a45d3b5f2" gracePeriod=30 Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.148345 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7dx25"] Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.148722 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7dx25" podUID="c0bfd881-720f-48b7-acdd-7bb4b3722471" containerName="registry-server" containerID="cri-o://4f66640b4f7098b76ab1a3c6a168d9b695eecd8d4944eb31f51c5bc46089f979" gracePeriod=30 Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.179128 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-98zw7"] Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.180213 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.184095 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xwh7d"] Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.184388 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xwh7d" podUID="431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" containerName="registry-server" containerID="cri-o://f7449c55e8f96e88395491531b7a87c218b6906ff0f7d79775475a2facd88619" gracePeriod=30 Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.185949 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1c9573cf-5cd6-4c3b-8c62-0766f942629a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-98zw7\" (UID: \"1c9573cf-5cd6-4c3b-8c62-0766f942629a\") " pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.185989 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c9573cf-5cd6-4c3b-8c62-0766f942629a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-98zw7\" (UID: \"1c9573cf-5cd6-4c3b-8c62-0766f942629a\") " pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.186039 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cpjj\" (UniqueName: \"kubernetes.io/projected/1c9573cf-5cd6-4c3b-8c62-0766f942629a-kube-api-access-9cpjj\") pod \"marketplace-operator-79b997595-98zw7\" (UID: \"1c9573cf-5cd6-4c3b-8c62-0766f942629a\") " pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" Oct 05 
20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.192058 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-98zw7"] Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.286722 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cpjj\" (UniqueName: \"kubernetes.io/projected/1c9573cf-5cd6-4c3b-8c62-0766f942629a-kube-api-access-9cpjj\") pod \"marketplace-operator-79b997595-98zw7\" (UID: \"1c9573cf-5cd6-4c3b-8c62-0766f942629a\") " pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.286811 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1c9573cf-5cd6-4c3b-8c62-0766f942629a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-98zw7\" (UID: \"1c9573cf-5cd6-4c3b-8c62-0766f942629a\") " pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.286842 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c9573cf-5cd6-4c3b-8c62-0766f942629a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-98zw7\" (UID: \"1c9573cf-5cd6-4c3b-8c62-0766f942629a\") " pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.287919 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c9573cf-5cd6-4c3b-8c62-0766f942629a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-98zw7\" (UID: \"1c9573cf-5cd6-4c3b-8c62-0766f942629a\") " pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.294046 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1c9573cf-5cd6-4c3b-8c62-0766f942629a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-98zw7\" (UID: \"1c9573cf-5cd6-4c3b-8c62-0766f942629a\") " pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.303181 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cpjj\" (UniqueName: \"kubernetes.io/projected/1c9573cf-5cd6-4c3b-8c62-0766f942629a-kube-api-access-9cpjj\") pod \"marketplace-operator-79b997595-98zw7\" (UID: \"1c9573cf-5cd6-4c3b-8c62-0766f942629a\") " pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.499569 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gx875" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.501279 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.580948 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.691071 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/834546fa-9104-4394-a674-e0350de62fb1-marketplace-trusted-ca\") pod \"834546fa-9104-4394-a674-e0350de62fb1\" (UID: \"834546fa-9104-4394-a674-e0350de62fb1\") " Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.691452 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mrvv\" (UniqueName: \"kubernetes.io/projected/befdf821-5bbf-4606-970d-ed88cf79993d-kube-api-access-6mrvv\") pod \"befdf821-5bbf-4606-970d-ed88cf79993d\" (UID: \"befdf821-5bbf-4606-970d-ed88cf79993d\") " Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.691487 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/834546fa-9104-4394-a674-e0350de62fb1-marketplace-operator-metrics\") pod \"834546fa-9104-4394-a674-e0350de62fb1\" (UID: \"834546fa-9104-4394-a674-e0350de62fb1\") " Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.691524 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/befdf821-5bbf-4606-970d-ed88cf79993d-catalog-content\") pod \"befdf821-5bbf-4606-970d-ed88cf79993d\" (UID: \"befdf821-5bbf-4606-970d-ed88cf79993d\") " Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.691588 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vt2k\" (UniqueName: \"kubernetes.io/projected/834546fa-9104-4394-a674-e0350de62fb1-kube-api-access-9vt2k\") pod \"834546fa-9104-4394-a674-e0350de62fb1\" (UID: \"834546fa-9104-4394-a674-e0350de62fb1\") " Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.691619 4753 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/befdf821-5bbf-4606-970d-ed88cf79993d-utilities\") pod \"befdf821-5bbf-4606-970d-ed88cf79993d\" (UID: \"befdf821-5bbf-4606-970d-ed88cf79993d\") " Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.693512 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/befdf821-5bbf-4606-970d-ed88cf79993d-utilities" (OuterVolumeSpecName: "utilities") pod "befdf821-5bbf-4606-970d-ed88cf79993d" (UID: "befdf821-5bbf-4606-970d-ed88cf79993d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.693564 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/834546fa-9104-4394-a674-e0350de62fb1-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "834546fa-9104-4394-a674-e0350de62fb1" (UID: "834546fa-9104-4394-a674-e0350de62fb1"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.699867 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/834546fa-9104-4394-a674-e0350de62fb1-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "834546fa-9104-4394-a674-e0350de62fb1" (UID: "834546fa-9104-4394-a674-e0350de62fb1"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.704204 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/befdf821-5bbf-4606-970d-ed88cf79993d-kube-api-access-6mrvv" (OuterVolumeSpecName: "kube-api-access-6mrvv") pod "befdf821-5bbf-4606-970d-ed88cf79993d" (UID: "befdf821-5bbf-4606-970d-ed88cf79993d"). InnerVolumeSpecName "kube-api-access-6mrvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.704259 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/834546fa-9104-4394-a674-e0350de62fb1-kube-api-access-9vt2k" (OuterVolumeSpecName: "kube-api-access-9vt2k") pod "834546fa-9104-4394-a674-e0350de62fb1" (UID: "834546fa-9104-4394-a674-e0350de62fb1"). InnerVolumeSpecName "kube-api-access-9vt2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.724996 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.746192 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.757568 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.793339 4753 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/834546fa-9104-4394-a674-e0350de62fb1-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.793381 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vt2k\" (UniqueName: \"kubernetes.io/projected/834546fa-9104-4394-a674-e0350de62fb1-kube-api-access-9vt2k\") on node \"crc\" DevicePath \"\"" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.793392 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/befdf821-5bbf-4606-970d-ed88cf79993d-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.793403 4753 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/834546fa-9104-4394-a674-e0350de62fb1-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.793411 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mrvv\" (UniqueName: \"kubernetes.io/projected/befdf821-5bbf-4606-970d-ed88cf79993d-kube-api-access-6mrvv\") on node \"crc\" DevicePath \"\"" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.795197 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/befdf821-5bbf-4606-970d-ed88cf79993d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "befdf821-5bbf-4606-970d-ed88cf79993d" (UID: "befdf821-5bbf-4606-970d-ed88cf79993d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.894101 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-utilities\") pod \"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4\" (UID: \"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4\") " Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.894219 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh9xd\" (UniqueName: \"kubernetes.io/projected/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-kube-api-access-qh9xd\") pod \"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4\" (UID: \"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4\") " Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.894251 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/846a9f6b-930c-4527-9b57-c8ca70b345d0-catalog-content\") pod \"846a9f6b-930c-4527-9b57-c8ca70b345d0\" (UID: \"846a9f6b-930c-4527-9b57-c8ca70b345d0\") " Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.894267 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/846a9f6b-930c-4527-9b57-c8ca70b345d0-utilities\") pod \"846a9f6b-930c-4527-9b57-c8ca70b345d0\" (UID: \"846a9f6b-930c-4527-9b57-c8ca70b345d0\") " Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.894298 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0bfd881-720f-48b7-acdd-7bb4b3722471-utilities\") pod \"c0bfd881-720f-48b7-acdd-7bb4b3722471\" (UID: \"c0bfd881-720f-48b7-acdd-7bb4b3722471\") " Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.894316 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28zrb\" (UniqueName: 
\"kubernetes.io/projected/c0bfd881-720f-48b7-acdd-7bb4b3722471-kube-api-access-28zrb\") pod \"c0bfd881-720f-48b7-acdd-7bb4b3722471\" (UID: \"c0bfd881-720f-48b7-acdd-7bb4b3722471\") " Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.894333 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0bfd881-720f-48b7-acdd-7bb4b3722471-catalog-content\") pod \"c0bfd881-720f-48b7-acdd-7bb4b3722471\" (UID: \"c0bfd881-720f-48b7-acdd-7bb4b3722471\") " Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.894361 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhz65\" (UniqueName: \"kubernetes.io/projected/846a9f6b-930c-4527-9b57-c8ca70b345d0-kube-api-access-qhz65\") pod \"846a9f6b-930c-4527-9b57-c8ca70b345d0\" (UID: \"846a9f6b-930c-4527-9b57-c8ca70b345d0\") " Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.894397 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-catalog-content\") pod \"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4\" (UID: \"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4\") " Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.894610 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/befdf821-5bbf-4606-970d-ed88cf79993d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.895194 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0bfd881-720f-48b7-acdd-7bb4b3722471-utilities" (OuterVolumeSpecName: "utilities") pod "c0bfd881-720f-48b7-acdd-7bb4b3722471" (UID: "c0bfd881-720f-48b7-acdd-7bb4b3722471"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.895256 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-utilities" (OuterVolumeSpecName: "utilities") pod "431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" (UID: "431bdbc7-61ed-4d70-9d8c-6576bc51e2d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.895516 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/846a9f6b-930c-4527-9b57-c8ca70b345d0-utilities" (OuterVolumeSpecName: "utilities") pod "846a9f6b-930c-4527-9b57-c8ca70b345d0" (UID: "846a9f6b-930c-4527-9b57-c8ca70b345d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.909779 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0bfd881-720f-48b7-acdd-7bb4b3722471-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0bfd881-720f-48b7-acdd-7bb4b3722471" (UID: "c0bfd881-720f-48b7-acdd-7bb4b3722471"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.910387 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/846a9f6b-930c-4527-9b57-c8ca70b345d0-kube-api-access-qhz65" (OuterVolumeSpecName: "kube-api-access-qhz65") pod "846a9f6b-930c-4527-9b57-c8ca70b345d0" (UID: "846a9f6b-930c-4527-9b57-c8ca70b345d0"). InnerVolumeSpecName "kube-api-access-qhz65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.910468 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-kube-api-access-qh9xd" (OuterVolumeSpecName: "kube-api-access-qh9xd") pod "431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" (UID: "431bdbc7-61ed-4d70-9d8c-6576bc51e2d4"). InnerVolumeSpecName "kube-api-access-qh9xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.913648 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0bfd881-720f-48b7-acdd-7bb4b3722471-kube-api-access-28zrb" (OuterVolumeSpecName: "kube-api-access-28zrb") pod "c0bfd881-720f-48b7-acdd-7bb4b3722471" (UID: "c0bfd881-720f-48b7-acdd-7bb4b3722471"). InnerVolumeSpecName "kube-api-access-28zrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.971356 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/846a9f6b-930c-4527-9b57-c8ca70b345d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "846a9f6b-930c-4527-9b57-c8ca70b345d0" (UID: "846a9f6b-930c-4527-9b57-c8ca70b345d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.995065 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0bfd881-720f-48b7-acdd-7bb4b3722471-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.995693 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28zrb\" (UniqueName: \"kubernetes.io/projected/c0bfd881-720f-48b7-acdd-7bb4b3722471-kube-api-access-28zrb\") on node \"crc\" DevicePath \"\"" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.995763 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0bfd881-720f-48b7-acdd-7bb4b3722471-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.995821 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhz65\" (UniqueName: \"kubernetes.io/projected/846a9f6b-930c-4527-9b57-c8ca70b345d0-kube-api-access-qhz65\") on node \"crc\" DevicePath \"\"" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.995886 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.995952 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh9xd\" (UniqueName: \"kubernetes.io/projected/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-kube-api-access-qh9xd\") on node \"crc\" DevicePath \"\"" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.996015 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/846a9f6b-930c-4527-9b57-c8ca70b345d0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:19:18 crc kubenswrapper[4753]: 
I1005 20:19:18.996076 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/846a9f6b-930c-4527-9b57-c8ca70b345d0-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.996638 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-98zw7"] Oct 05 20:19:18 crc kubenswrapper[4753]: I1005 20:19:18.997558 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" (UID: "431bdbc7-61ed-4d70-9d8c-6576bc51e2d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.096724 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.123903 4753 generic.go:334] "Generic (PLEG): container finished" podID="846a9f6b-930c-4527-9b57-c8ca70b345d0" containerID="297ec75f0e8766754f4ee23a52357d7fa22c76473434d5fa3513ea22c2cfcb1e" exitCode=0 Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.124034 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vm5zp" event={"ID":"846a9f6b-930c-4527-9b57-c8ca70b345d0","Type":"ContainerDied","Data":"297ec75f0e8766754f4ee23a52357d7fa22c76473434d5fa3513ea22c2cfcb1e"} Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.125090 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vm5zp" 
event={"ID":"846a9f6b-930c-4527-9b57-c8ca70b345d0","Type":"ContainerDied","Data":"db3f8ececd5b1c0d0e84d0278ca5ff8f06351f734f6201009fafa10198c661f7"} Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.125127 4753 scope.go:117] "RemoveContainer" containerID="297ec75f0e8766754f4ee23a52357d7fa22c76473434d5fa3513ea22c2cfcb1e" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.124156 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vm5zp" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.127849 4753 generic.go:334] "Generic (PLEG): container finished" podID="c0bfd881-720f-48b7-acdd-7bb4b3722471" containerID="4f66640b4f7098b76ab1a3c6a168d9b695eecd8d4944eb31f51c5bc46089f979" exitCode=0 Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.127911 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dx25" event={"ID":"c0bfd881-720f-48b7-acdd-7bb4b3722471","Type":"ContainerDied","Data":"4f66640b4f7098b76ab1a3c6a168d9b695eecd8d4944eb31f51c5bc46089f979"} Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.127944 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dx25" event={"ID":"c0bfd881-720f-48b7-acdd-7bb4b3722471","Type":"ContainerDied","Data":"2c4b026c3300f58faef47435768d5f8369e9eaffe280a227f462a6390fa03ec4"} Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.128018 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7dx25" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.134260 4753 generic.go:334] "Generic (PLEG): container finished" podID="834546fa-9104-4394-a674-e0350de62fb1" containerID="27f82debeab5b09bd56f19ef600a5745d895a5a0817a061216b7ac8a45d3b5f2" exitCode=0 Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.134453 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gx875" event={"ID":"834546fa-9104-4394-a674-e0350de62fb1","Type":"ContainerDied","Data":"27f82debeab5b09bd56f19ef600a5745d895a5a0817a061216b7ac8a45d3b5f2"} Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.134577 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gx875" event={"ID":"834546fa-9104-4394-a674-e0350de62fb1","Type":"ContainerDied","Data":"f9ed716ca58adfc1bd9841ddb4c9df3099480492ffee6e7c686b365710e93e94"} Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.134756 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gx875" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.140200 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" event={"ID":"1c9573cf-5cd6-4c3b-8c62-0766f942629a","Type":"ContainerStarted","Data":"ee15ba9233418299c0ea03eaff984effb9da6447862cef99ec1035d5bd4c95e3"} Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.140389 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" event={"ID":"1c9573cf-5cd6-4c3b-8c62-0766f942629a","Type":"ContainerStarted","Data":"0db16f8e784cf98093422aac96b785ba20adcc8519c9941047a458e19e6a6f36"} Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.143869 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.144115 4753 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-98zw7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body= Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.144267 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" podUID="1c9573cf-5cd6-4c3b-8c62-0766f942629a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.145867 4753 scope.go:117] "RemoveContainer" containerID="051f249e6c3e3695bad48c340baa44157c9f5e7457c680751043e8c090962b36" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.148719 4753 generic.go:334] "Generic (PLEG): container 
finished" podID="431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" containerID="f7449c55e8f96e88395491531b7a87c218b6906ff0f7d79775475a2facd88619" exitCode=0 Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.148780 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwh7d" event={"ID":"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4","Type":"ContainerDied","Data":"f7449c55e8f96e88395491531b7a87c218b6906ff0f7d79775475a2facd88619"} Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.148809 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwh7d" event={"ID":"431bdbc7-61ed-4d70-9d8c-6576bc51e2d4","Type":"ContainerDied","Data":"c19884472f37aa9fd78e2a2faf72acc8ca443ea504f5362a81a80427fc7cecc2"} Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.148883 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xwh7d" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.158501 4753 generic.go:334] "Generic (PLEG): container finished" podID="befdf821-5bbf-4606-970d-ed88cf79993d" containerID="dbcd15ac257176a4d9bb910b1871ca0ec6384c2785a5f7e8aabfa2862fc40ed4" exitCode=0 Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.158543 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hldsk" event={"ID":"befdf821-5bbf-4606-970d-ed88cf79993d","Type":"ContainerDied","Data":"dbcd15ac257176a4d9bb910b1871ca0ec6384c2785a5f7e8aabfa2862fc40ed4"} Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.158571 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hldsk" event={"ID":"befdf821-5bbf-4606-970d-ed88cf79993d","Type":"ContainerDied","Data":"13c450557ee0aba43b33f09e9bfd9f9c74b8fcf80b043855c94b368def418e83"} Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.158633 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hldsk" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.162112 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" podStartSLOduration=1.162098469 podStartE2EDuration="1.162098469s" podCreationTimestamp="2025-10-05 20:19:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:19:19.161127636 +0000 UTC m=+268.009455868" watchObservedRunningTime="2025-10-05 20:19:19.162098469 +0000 UTC m=+268.010426701" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.181764 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vm5zp"] Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.187116 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vm5zp"] Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.190687 4753 scope.go:117] "RemoveContainer" containerID="a096ab7d1afce8c26bac59e3e529bd11c0abcb608d8d1a8fb5ee636a1bb84a9b" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.193074 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gx875"] Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.201539 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gx875"] Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.214024 4753 scope.go:117] "RemoveContainer" containerID="297ec75f0e8766754f4ee23a52357d7fa22c76473434d5fa3513ea22c2cfcb1e" Oct 05 20:19:19 crc kubenswrapper[4753]: E1005 20:19:19.217109 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"297ec75f0e8766754f4ee23a52357d7fa22c76473434d5fa3513ea22c2cfcb1e\": 
container with ID starting with 297ec75f0e8766754f4ee23a52357d7fa22c76473434d5fa3513ea22c2cfcb1e not found: ID does not exist" containerID="297ec75f0e8766754f4ee23a52357d7fa22c76473434d5fa3513ea22c2cfcb1e" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.217171 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"297ec75f0e8766754f4ee23a52357d7fa22c76473434d5fa3513ea22c2cfcb1e"} err="failed to get container status \"297ec75f0e8766754f4ee23a52357d7fa22c76473434d5fa3513ea22c2cfcb1e\": rpc error: code = NotFound desc = could not find container \"297ec75f0e8766754f4ee23a52357d7fa22c76473434d5fa3513ea22c2cfcb1e\": container with ID starting with 297ec75f0e8766754f4ee23a52357d7fa22c76473434d5fa3513ea22c2cfcb1e not found: ID does not exist" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.217200 4753 scope.go:117] "RemoveContainer" containerID="051f249e6c3e3695bad48c340baa44157c9f5e7457c680751043e8c090962b36" Oct 05 20:19:19 crc kubenswrapper[4753]: E1005 20:19:19.222344 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"051f249e6c3e3695bad48c340baa44157c9f5e7457c680751043e8c090962b36\": container with ID starting with 051f249e6c3e3695bad48c340baa44157c9f5e7457c680751043e8c090962b36 not found: ID does not exist" containerID="051f249e6c3e3695bad48c340baa44157c9f5e7457c680751043e8c090962b36" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.222401 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"051f249e6c3e3695bad48c340baa44157c9f5e7457c680751043e8c090962b36"} err="failed to get container status \"051f249e6c3e3695bad48c340baa44157c9f5e7457c680751043e8c090962b36\": rpc error: code = NotFound desc = could not find container \"051f249e6c3e3695bad48c340baa44157c9f5e7457c680751043e8c090962b36\": container with ID starting with 
051f249e6c3e3695bad48c340baa44157c9f5e7457c680751043e8c090962b36 not found: ID does not exist" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.222447 4753 scope.go:117] "RemoveContainer" containerID="a096ab7d1afce8c26bac59e3e529bd11c0abcb608d8d1a8fb5ee636a1bb84a9b" Oct 05 20:19:19 crc kubenswrapper[4753]: E1005 20:19:19.222965 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a096ab7d1afce8c26bac59e3e529bd11c0abcb608d8d1a8fb5ee636a1bb84a9b\": container with ID starting with a096ab7d1afce8c26bac59e3e529bd11c0abcb608d8d1a8fb5ee636a1bb84a9b not found: ID does not exist" containerID="a096ab7d1afce8c26bac59e3e529bd11c0abcb608d8d1a8fb5ee636a1bb84a9b" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.222999 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a096ab7d1afce8c26bac59e3e529bd11c0abcb608d8d1a8fb5ee636a1bb84a9b"} err="failed to get container status \"a096ab7d1afce8c26bac59e3e529bd11c0abcb608d8d1a8fb5ee636a1bb84a9b\": rpc error: code = NotFound desc = could not find container \"a096ab7d1afce8c26bac59e3e529bd11c0abcb608d8d1a8fb5ee636a1bb84a9b\": container with ID starting with a096ab7d1afce8c26bac59e3e529bd11c0abcb608d8d1a8fb5ee636a1bb84a9b not found: ID does not exist" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.223015 4753 scope.go:117] "RemoveContainer" containerID="4f66640b4f7098b76ab1a3c6a168d9b695eecd8d4944eb31f51c5bc46089f979" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.228876 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7dx25"] Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.240624 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7dx25"] Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.243049 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-xwh7d"] Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.249203 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xwh7d"] Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.254543 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hldsk"] Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.256508 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hldsk"] Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.283466 4753 scope.go:117] "RemoveContainer" containerID="90b78000311c5dfc217aa53af69be66582d9ef3da54e02bf8b9b468004bbeeef" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.303617 4753 scope.go:117] "RemoveContainer" containerID="64b9605d6a8db1f69e151a64497c4f9018a07ce2deb8af5c63de768e7ff270ee" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.328371 4753 scope.go:117] "RemoveContainer" containerID="4f66640b4f7098b76ab1a3c6a168d9b695eecd8d4944eb31f51c5bc46089f979" Oct 05 20:19:19 crc kubenswrapper[4753]: E1005 20:19:19.328980 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f66640b4f7098b76ab1a3c6a168d9b695eecd8d4944eb31f51c5bc46089f979\": container with ID starting with 4f66640b4f7098b76ab1a3c6a168d9b695eecd8d4944eb31f51c5bc46089f979 not found: ID does not exist" containerID="4f66640b4f7098b76ab1a3c6a168d9b695eecd8d4944eb31f51c5bc46089f979" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.329011 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f66640b4f7098b76ab1a3c6a168d9b695eecd8d4944eb31f51c5bc46089f979"} err="failed to get container status \"4f66640b4f7098b76ab1a3c6a168d9b695eecd8d4944eb31f51c5bc46089f979\": rpc error: code = NotFound desc = could not find container 
\"4f66640b4f7098b76ab1a3c6a168d9b695eecd8d4944eb31f51c5bc46089f979\": container with ID starting with 4f66640b4f7098b76ab1a3c6a168d9b695eecd8d4944eb31f51c5bc46089f979 not found: ID does not exist" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.329034 4753 scope.go:117] "RemoveContainer" containerID="90b78000311c5dfc217aa53af69be66582d9ef3da54e02bf8b9b468004bbeeef" Oct 05 20:19:19 crc kubenswrapper[4753]: E1005 20:19:19.329416 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90b78000311c5dfc217aa53af69be66582d9ef3da54e02bf8b9b468004bbeeef\": container with ID starting with 90b78000311c5dfc217aa53af69be66582d9ef3da54e02bf8b9b468004bbeeef not found: ID does not exist" containerID="90b78000311c5dfc217aa53af69be66582d9ef3da54e02bf8b9b468004bbeeef" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.329438 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b78000311c5dfc217aa53af69be66582d9ef3da54e02bf8b9b468004bbeeef"} err="failed to get container status \"90b78000311c5dfc217aa53af69be66582d9ef3da54e02bf8b9b468004bbeeef\": rpc error: code = NotFound desc = could not find container \"90b78000311c5dfc217aa53af69be66582d9ef3da54e02bf8b9b468004bbeeef\": container with ID starting with 90b78000311c5dfc217aa53af69be66582d9ef3da54e02bf8b9b468004bbeeef not found: ID does not exist" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.329453 4753 scope.go:117] "RemoveContainer" containerID="64b9605d6a8db1f69e151a64497c4f9018a07ce2deb8af5c63de768e7ff270ee" Oct 05 20:19:19 crc kubenswrapper[4753]: E1005 20:19:19.329637 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64b9605d6a8db1f69e151a64497c4f9018a07ce2deb8af5c63de768e7ff270ee\": container with ID starting with 64b9605d6a8db1f69e151a64497c4f9018a07ce2deb8af5c63de768e7ff270ee not found: ID does not exist" 
containerID="64b9605d6a8db1f69e151a64497c4f9018a07ce2deb8af5c63de768e7ff270ee" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.329657 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b9605d6a8db1f69e151a64497c4f9018a07ce2deb8af5c63de768e7ff270ee"} err="failed to get container status \"64b9605d6a8db1f69e151a64497c4f9018a07ce2deb8af5c63de768e7ff270ee\": rpc error: code = NotFound desc = could not find container \"64b9605d6a8db1f69e151a64497c4f9018a07ce2deb8af5c63de768e7ff270ee\": container with ID starting with 64b9605d6a8db1f69e151a64497c4f9018a07ce2deb8af5c63de768e7ff270ee not found: ID does not exist" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.329688 4753 scope.go:117] "RemoveContainer" containerID="27f82debeab5b09bd56f19ef600a5745d895a5a0817a061216b7ac8a45d3b5f2" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.342672 4753 scope.go:117] "RemoveContainer" containerID="27f82debeab5b09bd56f19ef600a5745d895a5a0817a061216b7ac8a45d3b5f2" Oct 05 20:19:19 crc kubenswrapper[4753]: E1005 20:19:19.343452 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f82debeab5b09bd56f19ef600a5745d895a5a0817a061216b7ac8a45d3b5f2\": container with ID starting with 27f82debeab5b09bd56f19ef600a5745d895a5a0817a061216b7ac8a45d3b5f2 not found: ID does not exist" containerID="27f82debeab5b09bd56f19ef600a5745d895a5a0817a061216b7ac8a45d3b5f2" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.343476 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f82debeab5b09bd56f19ef600a5745d895a5a0817a061216b7ac8a45d3b5f2"} err="failed to get container status \"27f82debeab5b09bd56f19ef600a5745d895a5a0817a061216b7ac8a45d3b5f2\": rpc error: code = NotFound desc = could not find container \"27f82debeab5b09bd56f19ef600a5745d895a5a0817a061216b7ac8a45d3b5f2\": container with ID starting with 
27f82debeab5b09bd56f19ef600a5745d895a5a0817a061216b7ac8a45d3b5f2 not found: ID does not exist" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.343491 4753 scope.go:117] "RemoveContainer" containerID="f7449c55e8f96e88395491531b7a87c218b6906ff0f7d79775475a2facd88619" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.354552 4753 scope.go:117] "RemoveContainer" containerID="655d2ada972eeae271b8283a2c9a51210ec9ac61f0cc1398787b2ec6671b4a33" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.372406 4753 scope.go:117] "RemoveContainer" containerID="25c45cc5d0c58c65554d2e948c2219792a406c46f4cb233dff98d2b90b27bc38" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.385175 4753 scope.go:117] "RemoveContainer" containerID="f7449c55e8f96e88395491531b7a87c218b6906ff0f7d79775475a2facd88619" Oct 05 20:19:19 crc kubenswrapper[4753]: E1005 20:19:19.386388 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7449c55e8f96e88395491531b7a87c218b6906ff0f7d79775475a2facd88619\": container with ID starting with f7449c55e8f96e88395491531b7a87c218b6906ff0f7d79775475a2facd88619 not found: ID does not exist" containerID="f7449c55e8f96e88395491531b7a87c218b6906ff0f7d79775475a2facd88619" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.386417 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7449c55e8f96e88395491531b7a87c218b6906ff0f7d79775475a2facd88619"} err="failed to get container status \"f7449c55e8f96e88395491531b7a87c218b6906ff0f7d79775475a2facd88619\": rpc error: code = NotFound desc = could not find container \"f7449c55e8f96e88395491531b7a87c218b6906ff0f7d79775475a2facd88619\": container with ID starting with f7449c55e8f96e88395491531b7a87c218b6906ff0f7d79775475a2facd88619 not found: ID does not exist" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.386446 4753 scope.go:117] "RemoveContainer" 
containerID="655d2ada972eeae271b8283a2c9a51210ec9ac61f0cc1398787b2ec6671b4a33" Oct 05 20:19:19 crc kubenswrapper[4753]: E1005 20:19:19.386876 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"655d2ada972eeae271b8283a2c9a51210ec9ac61f0cc1398787b2ec6671b4a33\": container with ID starting with 655d2ada972eeae271b8283a2c9a51210ec9ac61f0cc1398787b2ec6671b4a33 not found: ID does not exist" containerID="655d2ada972eeae271b8283a2c9a51210ec9ac61f0cc1398787b2ec6671b4a33" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.386901 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"655d2ada972eeae271b8283a2c9a51210ec9ac61f0cc1398787b2ec6671b4a33"} err="failed to get container status \"655d2ada972eeae271b8283a2c9a51210ec9ac61f0cc1398787b2ec6671b4a33\": rpc error: code = NotFound desc = could not find container \"655d2ada972eeae271b8283a2c9a51210ec9ac61f0cc1398787b2ec6671b4a33\": container with ID starting with 655d2ada972eeae271b8283a2c9a51210ec9ac61f0cc1398787b2ec6671b4a33 not found: ID does not exist" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.386915 4753 scope.go:117] "RemoveContainer" containerID="25c45cc5d0c58c65554d2e948c2219792a406c46f4cb233dff98d2b90b27bc38" Oct 05 20:19:19 crc kubenswrapper[4753]: E1005 20:19:19.387376 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25c45cc5d0c58c65554d2e948c2219792a406c46f4cb233dff98d2b90b27bc38\": container with ID starting with 25c45cc5d0c58c65554d2e948c2219792a406c46f4cb233dff98d2b90b27bc38 not found: ID does not exist" containerID="25c45cc5d0c58c65554d2e948c2219792a406c46f4cb233dff98d2b90b27bc38" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.387398 4753 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"25c45cc5d0c58c65554d2e948c2219792a406c46f4cb233dff98d2b90b27bc38"} err="failed to get container status \"25c45cc5d0c58c65554d2e948c2219792a406c46f4cb233dff98d2b90b27bc38\": rpc error: code = NotFound desc = could not find container \"25c45cc5d0c58c65554d2e948c2219792a406c46f4cb233dff98d2b90b27bc38\": container with ID starting with 25c45cc5d0c58c65554d2e948c2219792a406c46f4cb233dff98d2b90b27bc38 not found: ID does not exist" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.387413 4753 scope.go:117] "RemoveContainer" containerID="dbcd15ac257176a4d9bb910b1871ca0ec6384c2785a5f7e8aabfa2862fc40ed4" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.400811 4753 scope.go:117] "RemoveContainer" containerID="f4ad276cf1611492eeb4f394d0668466e6eb70bed0f578f84682d0508e441c2d" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.414189 4753 scope.go:117] "RemoveContainer" containerID="49480f646d71f5338fe80bdfee1ac389cffaba096294c38a080c007af7d78248" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.424952 4753 scope.go:117] "RemoveContainer" containerID="dbcd15ac257176a4d9bb910b1871ca0ec6384c2785a5f7e8aabfa2862fc40ed4" Oct 05 20:19:19 crc kubenswrapper[4753]: E1005 20:19:19.425365 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbcd15ac257176a4d9bb910b1871ca0ec6384c2785a5f7e8aabfa2862fc40ed4\": container with ID starting with dbcd15ac257176a4d9bb910b1871ca0ec6384c2785a5f7e8aabfa2862fc40ed4 not found: ID does not exist" containerID="dbcd15ac257176a4d9bb910b1871ca0ec6384c2785a5f7e8aabfa2862fc40ed4" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.425389 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbcd15ac257176a4d9bb910b1871ca0ec6384c2785a5f7e8aabfa2862fc40ed4"} err="failed to get container status \"dbcd15ac257176a4d9bb910b1871ca0ec6384c2785a5f7e8aabfa2862fc40ed4\": rpc error: code = 
NotFound desc = could not find container \"dbcd15ac257176a4d9bb910b1871ca0ec6384c2785a5f7e8aabfa2862fc40ed4\": container with ID starting with dbcd15ac257176a4d9bb910b1871ca0ec6384c2785a5f7e8aabfa2862fc40ed4 not found: ID does not exist" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.425413 4753 scope.go:117] "RemoveContainer" containerID="f4ad276cf1611492eeb4f394d0668466e6eb70bed0f578f84682d0508e441c2d" Oct 05 20:19:19 crc kubenswrapper[4753]: E1005 20:19:19.425939 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4ad276cf1611492eeb4f394d0668466e6eb70bed0f578f84682d0508e441c2d\": container with ID starting with f4ad276cf1611492eeb4f394d0668466e6eb70bed0f578f84682d0508e441c2d not found: ID does not exist" containerID="f4ad276cf1611492eeb4f394d0668466e6eb70bed0f578f84682d0508e441c2d" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.425955 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ad276cf1611492eeb4f394d0668466e6eb70bed0f578f84682d0508e441c2d"} err="failed to get container status \"f4ad276cf1611492eeb4f394d0668466e6eb70bed0f578f84682d0508e441c2d\": rpc error: code = NotFound desc = could not find container \"f4ad276cf1611492eeb4f394d0668466e6eb70bed0f578f84682d0508e441c2d\": container with ID starting with f4ad276cf1611492eeb4f394d0668466e6eb70bed0f578f84682d0508e441c2d not found: ID does not exist" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.425967 4753 scope.go:117] "RemoveContainer" containerID="49480f646d71f5338fe80bdfee1ac389cffaba096294c38a080c007af7d78248" Oct 05 20:19:19 crc kubenswrapper[4753]: E1005 20:19:19.426224 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49480f646d71f5338fe80bdfee1ac389cffaba096294c38a080c007af7d78248\": container with ID starting with 
49480f646d71f5338fe80bdfee1ac389cffaba096294c38a080c007af7d78248 not found: ID does not exist" containerID="49480f646d71f5338fe80bdfee1ac389cffaba096294c38a080c007af7d78248" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.426247 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49480f646d71f5338fe80bdfee1ac389cffaba096294c38a080c007af7d78248"} err="failed to get container status \"49480f646d71f5338fe80bdfee1ac389cffaba096294c38a080c007af7d78248\": rpc error: code = NotFound desc = could not find container \"49480f646d71f5338fe80bdfee1ac389cffaba096294c38a080c007af7d78248\": container with ID starting with 49480f646d71f5338fe80bdfee1ac389cffaba096294c38a080c007af7d78248 not found: ID does not exist" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.885034 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" path="/var/lib/kubelet/pods/431bdbc7-61ed-4d70-9d8c-6576bc51e2d4/volumes" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.886321 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="834546fa-9104-4394-a674-e0350de62fb1" path="/var/lib/kubelet/pods/834546fa-9104-4394-a674-e0350de62fb1/volumes" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.886918 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="846a9f6b-930c-4527-9b57-c8ca70b345d0" path="/var/lib/kubelet/pods/846a9f6b-930c-4527-9b57-c8ca70b345d0/volumes" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.888104 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="befdf821-5bbf-4606-970d-ed88cf79993d" path="/var/lib/kubelet/pods/befdf821-5bbf-4606-970d-ed88cf79993d/volumes" Oct 05 20:19:19 crc kubenswrapper[4753]: I1005 20:19:19.888940 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0bfd881-720f-48b7-acdd-7bb4b3722471" 
path="/var/lib/kubelet/pods/c0bfd881-720f-48b7-acdd-7bb4b3722471/volumes" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.172593 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-98zw7" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326331 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qnn2p"] Oct 05 20:19:20 crc kubenswrapper[4753]: E1005 20:19:20.326515 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834546fa-9104-4394-a674-e0350de62fb1" containerName="marketplace-operator" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326527 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="834546fa-9104-4394-a674-e0350de62fb1" containerName="marketplace-operator" Oct 05 20:19:20 crc kubenswrapper[4753]: E1005 20:19:20.326539 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="befdf821-5bbf-4606-970d-ed88cf79993d" containerName="extract-utilities" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326544 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="befdf821-5bbf-4606-970d-ed88cf79993d" containerName="extract-utilities" Oct 05 20:19:20 crc kubenswrapper[4753]: E1005 20:19:20.326552 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" containerName="registry-server" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326559 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" containerName="registry-server" Oct 05 20:19:20 crc kubenswrapper[4753]: E1005 20:19:20.326566 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="846a9f6b-930c-4527-9b57-c8ca70b345d0" containerName="extract-utilities" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326572 4753 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="846a9f6b-930c-4527-9b57-c8ca70b345d0" containerName="extract-utilities" Oct 05 20:19:20 crc kubenswrapper[4753]: E1005 20:19:20.326580 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0bfd881-720f-48b7-acdd-7bb4b3722471" containerName="extract-content" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326586 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0bfd881-720f-48b7-acdd-7bb4b3722471" containerName="extract-content" Oct 05 20:19:20 crc kubenswrapper[4753]: E1005 20:19:20.326597 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="846a9f6b-930c-4527-9b57-c8ca70b345d0" containerName="registry-server" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326603 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="846a9f6b-930c-4527-9b57-c8ca70b345d0" containerName="registry-server" Oct 05 20:19:20 crc kubenswrapper[4753]: E1005 20:19:20.326614 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="846a9f6b-930c-4527-9b57-c8ca70b345d0" containerName="extract-content" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326619 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="846a9f6b-930c-4527-9b57-c8ca70b345d0" containerName="extract-content" Oct 05 20:19:20 crc kubenswrapper[4753]: E1005 20:19:20.326627 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="befdf821-5bbf-4606-970d-ed88cf79993d" containerName="extract-content" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326634 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="befdf821-5bbf-4606-970d-ed88cf79993d" containerName="extract-content" Oct 05 20:19:20 crc kubenswrapper[4753]: E1005 20:19:20.326655 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0bfd881-720f-48b7-acdd-7bb4b3722471" containerName="registry-server" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326661 4753 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c0bfd881-720f-48b7-acdd-7bb4b3722471" containerName="registry-server" Oct 05 20:19:20 crc kubenswrapper[4753]: E1005 20:19:20.326668 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" containerName="extract-utilities" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326674 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" containerName="extract-utilities" Oct 05 20:19:20 crc kubenswrapper[4753]: E1005 20:19:20.326682 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" containerName="extract-content" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326688 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" containerName="extract-content" Oct 05 20:19:20 crc kubenswrapper[4753]: E1005 20:19:20.326697 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0bfd881-720f-48b7-acdd-7bb4b3722471" containerName="extract-utilities" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326702 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0bfd881-720f-48b7-acdd-7bb4b3722471" containerName="extract-utilities" Oct 05 20:19:20 crc kubenswrapper[4753]: E1005 20:19:20.326709 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="befdf821-5bbf-4606-970d-ed88cf79993d" containerName="registry-server" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326715 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="befdf821-5bbf-4606-970d-ed88cf79993d" containerName="registry-server" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326791 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="834546fa-9104-4394-a674-e0350de62fb1" containerName="marketplace-operator" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326800 4753 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="c0bfd881-720f-48b7-acdd-7bb4b3722471" containerName="registry-server" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326808 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="431bdbc7-61ed-4d70-9d8c-6576bc51e2d4" containerName="registry-server" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326815 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="846a9f6b-930c-4527-9b57-c8ca70b345d0" containerName="registry-server" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.326825 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="befdf821-5bbf-4606-970d-ed88cf79993d" containerName="registry-server" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.327470 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnn2p" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.329696 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.339015 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnn2p"] Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.513224 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b25c928-06b3-4bea-91cd-f1c609d1e785-catalog-content\") pod \"redhat-marketplace-qnn2p\" (UID: \"6b25c928-06b3-4bea-91cd-f1c609d1e785\") " pod="openshift-marketplace/redhat-marketplace-qnn2p" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.513279 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmjnc\" (UniqueName: \"kubernetes.io/projected/6b25c928-06b3-4bea-91cd-f1c609d1e785-kube-api-access-mmjnc\") pod \"redhat-marketplace-qnn2p\" (UID: 
\"6b25c928-06b3-4bea-91cd-f1c609d1e785\") " pod="openshift-marketplace/redhat-marketplace-qnn2p" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.513312 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b25c928-06b3-4bea-91cd-f1c609d1e785-utilities\") pod \"redhat-marketplace-qnn2p\" (UID: \"6b25c928-06b3-4bea-91cd-f1c609d1e785\") " pod="openshift-marketplace/redhat-marketplace-qnn2p" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.525617 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gdpm6"] Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.526565 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gdpm6" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.529265 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.537658 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gdpm6"] Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.614195 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b25c928-06b3-4bea-91cd-f1c609d1e785-utilities\") pod \"redhat-marketplace-qnn2p\" (UID: \"6b25c928-06b3-4bea-91cd-f1c609d1e785\") " pod="openshift-marketplace/redhat-marketplace-qnn2p" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.614293 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b25c928-06b3-4bea-91cd-f1c609d1e785-catalog-content\") pod \"redhat-marketplace-qnn2p\" (UID: \"6b25c928-06b3-4bea-91cd-f1c609d1e785\") " 
pod="openshift-marketplace/redhat-marketplace-qnn2p" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.614314 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmjnc\" (UniqueName: \"kubernetes.io/projected/6b25c928-06b3-4bea-91cd-f1c609d1e785-kube-api-access-mmjnc\") pod \"redhat-marketplace-qnn2p\" (UID: \"6b25c928-06b3-4bea-91cd-f1c609d1e785\") " pod="openshift-marketplace/redhat-marketplace-qnn2p" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.614728 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b25c928-06b3-4bea-91cd-f1c609d1e785-utilities\") pod \"redhat-marketplace-qnn2p\" (UID: \"6b25c928-06b3-4bea-91cd-f1c609d1e785\") " pod="openshift-marketplace/redhat-marketplace-qnn2p" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.614787 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b25c928-06b3-4bea-91cd-f1c609d1e785-catalog-content\") pod \"redhat-marketplace-qnn2p\" (UID: \"6b25c928-06b3-4bea-91cd-f1c609d1e785\") " pod="openshift-marketplace/redhat-marketplace-qnn2p" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.636020 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmjnc\" (UniqueName: \"kubernetes.io/projected/6b25c928-06b3-4bea-91cd-f1c609d1e785-kube-api-access-mmjnc\") pod \"redhat-marketplace-qnn2p\" (UID: \"6b25c928-06b3-4bea-91cd-f1c609d1e785\") " pod="openshift-marketplace/redhat-marketplace-qnn2p" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.643271 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnn2p" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.718786 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4bd315d-8dd0-4844-a675-ddc48827669d-utilities\") pod \"certified-operators-gdpm6\" (UID: \"a4bd315d-8dd0-4844-a675-ddc48827669d\") " pod="openshift-marketplace/certified-operators-gdpm6" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.718830 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4bd315d-8dd0-4844-a675-ddc48827669d-catalog-content\") pod \"certified-operators-gdpm6\" (UID: \"a4bd315d-8dd0-4844-a675-ddc48827669d\") " pod="openshift-marketplace/certified-operators-gdpm6" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.718857 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prf5z\" (UniqueName: \"kubernetes.io/projected/a4bd315d-8dd0-4844-a675-ddc48827669d-kube-api-access-prf5z\") pod \"certified-operators-gdpm6\" (UID: \"a4bd315d-8dd0-4844-a675-ddc48827669d\") " pod="openshift-marketplace/certified-operators-gdpm6" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.820881 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4bd315d-8dd0-4844-a675-ddc48827669d-utilities\") pod \"certified-operators-gdpm6\" (UID: \"a4bd315d-8dd0-4844-a675-ddc48827669d\") " pod="openshift-marketplace/certified-operators-gdpm6" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.820927 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4bd315d-8dd0-4844-a675-ddc48827669d-catalog-content\") pod 
\"certified-operators-gdpm6\" (UID: \"a4bd315d-8dd0-4844-a675-ddc48827669d\") " pod="openshift-marketplace/certified-operators-gdpm6" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.820954 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prf5z\" (UniqueName: \"kubernetes.io/projected/a4bd315d-8dd0-4844-a675-ddc48827669d-kube-api-access-prf5z\") pod \"certified-operators-gdpm6\" (UID: \"a4bd315d-8dd0-4844-a675-ddc48827669d\") " pod="openshift-marketplace/certified-operators-gdpm6" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.821408 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4bd315d-8dd0-4844-a675-ddc48827669d-catalog-content\") pod \"certified-operators-gdpm6\" (UID: \"a4bd315d-8dd0-4844-a675-ddc48827669d\") " pod="openshift-marketplace/certified-operators-gdpm6" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.821560 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4bd315d-8dd0-4844-a675-ddc48827669d-utilities\") pod \"certified-operators-gdpm6\" (UID: \"a4bd315d-8dd0-4844-a675-ddc48827669d\") " pod="openshift-marketplace/certified-operators-gdpm6" Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.838381 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnn2p"] Oct 05 20:19:20 crc kubenswrapper[4753]: I1005 20:19:20.845773 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prf5z\" (UniqueName: \"kubernetes.io/projected/a4bd315d-8dd0-4844-a675-ddc48827669d-kube-api-access-prf5z\") pod \"certified-operators-gdpm6\" (UID: \"a4bd315d-8dd0-4844-a675-ddc48827669d\") " pod="openshift-marketplace/certified-operators-gdpm6" Oct 05 20:19:21 crc kubenswrapper[4753]: I1005 20:19:21.140391 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gdpm6" Oct 05 20:19:21 crc kubenswrapper[4753]: I1005 20:19:21.175942 4753 generic.go:334] "Generic (PLEG): container finished" podID="6b25c928-06b3-4bea-91cd-f1c609d1e785" containerID="6e509146b06ddb8b4a14eb559f989a40de7476d46c6167fd174e183ee11152e8" exitCode=0 Oct 05 20:19:21 crc kubenswrapper[4753]: I1005 20:19:21.176131 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnn2p" event={"ID":"6b25c928-06b3-4bea-91cd-f1c609d1e785","Type":"ContainerDied","Data":"6e509146b06ddb8b4a14eb559f989a40de7476d46c6167fd174e183ee11152e8"} Oct 05 20:19:21 crc kubenswrapper[4753]: I1005 20:19:21.176231 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnn2p" event={"ID":"6b25c928-06b3-4bea-91cd-f1c609d1e785","Type":"ContainerStarted","Data":"6932cfce335e4510fca7fdde3cbce5c24c8df2383eda3d9b0c999de58e39ee6a"} Oct 05 20:19:21 crc kubenswrapper[4753]: I1005 20:19:21.322638 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gdpm6"] Oct 05 20:19:21 crc kubenswrapper[4753]: W1005 20:19:21.331924 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4bd315d_8dd0_4844_a675_ddc48827669d.slice/crio-c218653b90b41c8076dcfc2f9fdaa9edc406bfb9b52604ceeb2f7a7cc89747fb WatchSource:0}: Error finding container c218653b90b41c8076dcfc2f9fdaa9edc406bfb9b52604ceeb2f7a7cc89747fb: Status 404 returned error can't find the container with id c218653b90b41c8076dcfc2f9fdaa9edc406bfb9b52604ceeb2f7a7cc89747fb Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.183653 4753 generic.go:334] "Generic (PLEG): container finished" podID="a4bd315d-8dd0-4844-a675-ddc48827669d" containerID="1e2c6d4cbc8fd553804a22d325b59774ad15883b6737607bf306c600ddc4d008" exitCode=0 Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 
20:19:22.183733 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdpm6" event={"ID":"a4bd315d-8dd0-4844-a675-ddc48827669d","Type":"ContainerDied","Data":"1e2c6d4cbc8fd553804a22d325b59774ad15883b6737607bf306c600ddc4d008"} Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.183773 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdpm6" event={"ID":"a4bd315d-8dd0-4844-a675-ddc48827669d","Type":"ContainerStarted","Data":"c218653b90b41c8076dcfc2f9fdaa9edc406bfb9b52604ceeb2f7a7cc89747fb"} Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.195529 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnn2p" event={"ID":"6b25c928-06b3-4bea-91cd-f1c609d1e785","Type":"ContainerStarted","Data":"1ad014bc4437342dacac3c44a73cf6d4ca0b210f542bb82b437d7ba722a22f46"} Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.725832 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nhg75"] Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.727215 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nhg75" Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.730200 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.740640 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nhg75"] Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.767183 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86rg6\" (UniqueName: \"kubernetes.io/projected/7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8-kube-api-access-86rg6\") pod \"redhat-operators-nhg75\" (UID: \"7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8\") " pod="openshift-marketplace/redhat-operators-nhg75" Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.767345 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8-catalog-content\") pod \"redhat-operators-nhg75\" (UID: \"7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8\") " pod="openshift-marketplace/redhat-operators-nhg75" Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.767381 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8-utilities\") pod \"redhat-operators-nhg75\" (UID: \"7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8\") " pod="openshift-marketplace/redhat-operators-nhg75" Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.868851 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86rg6\" (UniqueName: \"kubernetes.io/projected/7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8-kube-api-access-86rg6\") pod \"redhat-operators-nhg75\" (UID: 
\"7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8\") " pod="openshift-marketplace/redhat-operators-nhg75" Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.868919 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8-catalog-content\") pod \"redhat-operators-nhg75\" (UID: \"7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8\") " pod="openshift-marketplace/redhat-operators-nhg75" Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.868949 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8-utilities\") pod \"redhat-operators-nhg75\" (UID: \"7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8\") " pod="openshift-marketplace/redhat-operators-nhg75" Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.869945 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8-utilities\") pod \"redhat-operators-nhg75\" (UID: \"7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8\") " pod="openshift-marketplace/redhat-operators-nhg75" Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.870006 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8-catalog-content\") pod \"redhat-operators-nhg75\" (UID: \"7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8\") " pod="openshift-marketplace/redhat-operators-nhg75" Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.890484 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86rg6\" (UniqueName: \"kubernetes.io/projected/7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8-kube-api-access-86rg6\") pod \"redhat-operators-nhg75\" (UID: \"7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8\") " 
pod="openshift-marketplace/redhat-operators-nhg75" Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.930389 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jq98s"] Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.931648 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jq98s" Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.937051 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.964016 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jq98s"] Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.970406 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pjct\" (UniqueName: \"kubernetes.io/projected/09240459-8b63-4037-b2d7-0f3a2e294835-kube-api-access-8pjct\") pod \"community-operators-jq98s\" (UID: \"09240459-8b63-4037-b2d7-0f3a2e294835\") " pod="openshift-marketplace/community-operators-jq98s" Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.970883 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09240459-8b63-4037-b2d7-0f3a2e294835-utilities\") pod \"community-operators-jq98s\" (UID: \"09240459-8b63-4037-b2d7-0f3a2e294835\") " pod="openshift-marketplace/community-operators-jq98s" Oct 05 20:19:22 crc kubenswrapper[4753]: I1005 20:19:22.971052 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09240459-8b63-4037-b2d7-0f3a2e294835-catalog-content\") pod \"community-operators-jq98s\" (UID: \"09240459-8b63-4037-b2d7-0f3a2e294835\") " 
pod="openshift-marketplace/community-operators-jq98s" Oct 05 20:19:23 crc kubenswrapper[4753]: I1005 20:19:23.059652 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhg75" Oct 05 20:19:23 crc kubenswrapper[4753]: I1005 20:19:23.071844 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09240459-8b63-4037-b2d7-0f3a2e294835-utilities\") pod \"community-operators-jq98s\" (UID: \"09240459-8b63-4037-b2d7-0f3a2e294835\") " pod="openshift-marketplace/community-operators-jq98s" Oct 05 20:19:23 crc kubenswrapper[4753]: I1005 20:19:23.072401 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09240459-8b63-4037-b2d7-0f3a2e294835-utilities\") pod \"community-operators-jq98s\" (UID: \"09240459-8b63-4037-b2d7-0f3a2e294835\") " pod="openshift-marketplace/community-operators-jq98s" Oct 05 20:19:23 crc kubenswrapper[4753]: I1005 20:19:23.072464 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09240459-8b63-4037-b2d7-0f3a2e294835-catalog-content\") pod \"community-operators-jq98s\" (UID: \"09240459-8b63-4037-b2d7-0f3a2e294835\") " pod="openshift-marketplace/community-operators-jq98s" Oct 05 20:19:23 crc kubenswrapper[4753]: I1005 20:19:23.072713 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09240459-8b63-4037-b2d7-0f3a2e294835-catalog-content\") pod \"community-operators-jq98s\" (UID: \"09240459-8b63-4037-b2d7-0f3a2e294835\") " pod="openshift-marketplace/community-operators-jq98s" Oct 05 20:19:23 crc kubenswrapper[4753]: I1005 20:19:23.072748 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pjct\" (UniqueName: 
\"kubernetes.io/projected/09240459-8b63-4037-b2d7-0f3a2e294835-kube-api-access-8pjct\") pod \"community-operators-jq98s\" (UID: \"09240459-8b63-4037-b2d7-0f3a2e294835\") " pod="openshift-marketplace/community-operators-jq98s" Oct 05 20:19:23 crc kubenswrapper[4753]: I1005 20:19:23.094041 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pjct\" (UniqueName: \"kubernetes.io/projected/09240459-8b63-4037-b2d7-0f3a2e294835-kube-api-access-8pjct\") pod \"community-operators-jq98s\" (UID: \"09240459-8b63-4037-b2d7-0f3a2e294835\") " pod="openshift-marketplace/community-operators-jq98s" Oct 05 20:19:23 crc kubenswrapper[4753]: I1005 20:19:23.209316 4753 generic.go:334] "Generic (PLEG): container finished" podID="a4bd315d-8dd0-4844-a675-ddc48827669d" containerID="03250c185e855aa0983d79381723b4723fcb2ca68229131743d5af230ddfd696" exitCode=0 Oct 05 20:19:23 crc kubenswrapper[4753]: I1005 20:19:23.209399 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdpm6" event={"ID":"a4bd315d-8dd0-4844-a675-ddc48827669d","Type":"ContainerDied","Data":"03250c185e855aa0983d79381723b4723fcb2ca68229131743d5af230ddfd696"} Oct 05 20:19:23 crc kubenswrapper[4753]: I1005 20:19:23.214662 4753 generic.go:334] "Generic (PLEG): container finished" podID="6b25c928-06b3-4bea-91cd-f1c609d1e785" containerID="1ad014bc4437342dacac3c44a73cf6d4ca0b210f542bb82b437d7ba722a22f46" exitCode=0 Oct 05 20:19:23 crc kubenswrapper[4753]: I1005 20:19:23.214701 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnn2p" event={"ID":"6b25c928-06b3-4bea-91cd-f1c609d1e785","Type":"ContainerDied","Data":"1ad014bc4437342dacac3c44a73cf6d4ca0b210f542bb82b437d7ba722a22f46"} Oct 05 20:19:23 crc kubenswrapper[4753]: I1005 20:19:23.253084 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jq98s" Oct 05 20:19:23 crc kubenswrapper[4753]: I1005 20:19:23.458704 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nhg75"] Oct 05 20:19:23 crc kubenswrapper[4753]: W1005 20:19:23.462117 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f31a46b_3ec3_42f3_94e4_a5fe09e0a2a8.slice/crio-02fee7e7cdb7eecc4f439c78766036e6944e47414372c533e6f57542f4b9c34e WatchSource:0}: Error finding container 02fee7e7cdb7eecc4f439c78766036e6944e47414372c533e6f57542f4b9c34e: Status 404 returned error can't find the container with id 02fee7e7cdb7eecc4f439c78766036e6944e47414372c533e6f57542f4b9c34e Oct 05 20:19:23 crc kubenswrapper[4753]: I1005 20:19:23.659475 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jq98s"] Oct 05 20:19:23 crc kubenswrapper[4753]: W1005 20:19:23.674869 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09240459_8b63_4037_b2d7_0f3a2e294835.slice/crio-28e2e5cd80efd66d38bdadee1e19a33f5e5ef35622957ebb50a82841881da245 WatchSource:0}: Error finding container 28e2e5cd80efd66d38bdadee1e19a33f5e5ef35622957ebb50a82841881da245: Status 404 returned error can't find the container with id 28e2e5cd80efd66d38bdadee1e19a33f5e5ef35622957ebb50a82841881da245 Oct 05 20:19:24 crc kubenswrapper[4753]: I1005 20:19:24.221331 4753 generic.go:334] "Generic (PLEG): container finished" podID="7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8" containerID="3282b6b3411b50485ca460a9ddc9bab25ee0e6b2ae197b365a29425ecfd51dbb" exitCode=0 Oct 05 20:19:24 crc kubenswrapper[4753]: I1005 20:19:24.221383 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhg75" 
event={"ID":"7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8","Type":"ContainerDied","Data":"3282b6b3411b50485ca460a9ddc9bab25ee0e6b2ae197b365a29425ecfd51dbb"} Oct 05 20:19:24 crc kubenswrapper[4753]: I1005 20:19:24.221431 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhg75" event={"ID":"7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8","Type":"ContainerStarted","Data":"02fee7e7cdb7eecc4f439c78766036e6944e47414372c533e6f57542f4b9c34e"} Oct 05 20:19:24 crc kubenswrapper[4753]: I1005 20:19:24.225798 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdpm6" event={"ID":"a4bd315d-8dd0-4844-a675-ddc48827669d","Type":"ContainerStarted","Data":"253fc15aec9881138393bf50f9aeae904c3a5cf81c09c1850af0e25137979e36"} Oct 05 20:19:24 crc kubenswrapper[4753]: I1005 20:19:24.228464 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnn2p" event={"ID":"6b25c928-06b3-4bea-91cd-f1c609d1e785","Type":"ContainerStarted","Data":"ba3a02e55be8e9a0eb8284528155b09967a2cbdd02e1d0c8a44b28b473540046"} Oct 05 20:19:24 crc kubenswrapper[4753]: I1005 20:19:24.229935 4753 generic.go:334] "Generic (PLEG): container finished" podID="09240459-8b63-4037-b2d7-0f3a2e294835" containerID="aad0559b7eaa1fae07ae135ab0c8849773cc2d292e12eafe56f5c0030e12b25c" exitCode=0 Oct 05 20:19:24 crc kubenswrapper[4753]: I1005 20:19:24.229965 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jq98s" event={"ID":"09240459-8b63-4037-b2d7-0f3a2e294835","Type":"ContainerDied","Data":"aad0559b7eaa1fae07ae135ab0c8849773cc2d292e12eafe56f5c0030e12b25c"} Oct 05 20:19:24 crc kubenswrapper[4753]: I1005 20:19:24.229991 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jq98s" 
event={"ID":"09240459-8b63-4037-b2d7-0f3a2e294835","Type":"ContainerStarted","Data":"28e2e5cd80efd66d38bdadee1e19a33f5e5ef35622957ebb50a82841881da245"} Oct 05 20:19:24 crc kubenswrapper[4753]: I1005 20:19:24.274509 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qnn2p" podStartSLOduration=1.795557934 podStartE2EDuration="4.274482849s" podCreationTimestamp="2025-10-05 20:19:20 +0000 UTC" firstStartedPulling="2025-10-05 20:19:21.178006102 +0000 UTC m=+270.026334334" lastFinishedPulling="2025-10-05 20:19:23.656931017 +0000 UTC m=+272.505259249" observedRunningTime="2025-10-05 20:19:24.271476919 +0000 UTC m=+273.119805151" watchObservedRunningTime="2025-10-05 20:19:24.274482849 +0000 UTC m=+273.122811071" Oct 05 20:19:24 crc kubenswrapper[4753]: I1005 20:19:24.295447 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gdpm6" podStartSLOduration=2.869033512 podStartE2EDuration="4.295412855s" podCreationTimestamp="2025-10-05 20:19:20 +0000 UTC" firstStartedPulling="2025-10-05 20:19:22.186708227 +0000 UTC m=+271.035036469" lastFinishedPulling="2025-10-05 20:19:23.61308758 +0000 UTC m=+272.461415812" observedRunningTime="2025-10-05 20:19:24.292746767 +0000 UTC m=+273.141074999" watchObservedRunningTime="2025-10-05 20:19:24.295412855 +0000 UTC m=+273.143741087" Oct 05 20:19:25 crc kubenswrapper[4753]: I1005 20:19:25.238410 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhg75" event={"ID":"7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8","Type":"ContainerStarted","Data":"b5e77b6f7bcdbe9d6ee1fa30103da720f2b20296a2ca4815313ef4f6eac40157"} Oct 05 20:19:25 crc kubenswrapper[4753]: I1005 20:19:25.241071 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jq98s" 
event={"ID":"09240459-8b63-4037-b2d7-0f3a2e294835","Type":"ContainerStarted","Data":"8e56c8270fd4a2a80b33942dfe97a86d3e23b3baddf4b8547e324471e3357031"} Oct 05 20:19:26 crc kubenswrapper[4753]: I1005 20:19:26.249018 4753 generic.go:334] "Generic (PLEG): container finished" podID="09240459-8b63-4037-b2d7-0f3a2e294835" containerID="8e56c8270fd4a2a80b33942dfe97a86d3e23b3baddf4b8547e324471e3357031" exitCode=0 Oct 05 20:19:26 crc kubenswrapper[4753]: I1005 20:19:26.249104 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jq98s" event={"ID":"09240459-8b63-4037-b2d7-0f3a2e294835","Type":"ContainerDied","Data":"8e56c8270fd4a2a80b33942dfe97a86d3e23b3baddf4b8547e324471e3357031"} Oct 05 20:19:26 crc kubenswrapper[4753]: I1005 20:19:26.255314 4753 generic.go:334] "Generic (PLEG): container finished" podID="7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8" containerID="b5e77b6f7bcdbe9d6ee1fa30103da720f2b20296a2ca4815313ef4f6eac40157" exitCode=0 Oct 05 20:19:26 crc kubenswrapper[4753]: I1005 20:19:26.255359 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhg75" event={"ID":"7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8","Type":"ContainerDied","Data":"b5e77b6f7bcdbe9d6ee1fa30103da720f2b20296a2ca4815313ef4f6eac40157"} Oct 05 20:19:28 crc kubenswrapper[4753]: I1005 20:19:28.270234 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jq98s" event={"ID":"09240459-8b63-4037-b2d7-0f3a2e294835","Type":"ContainerStarted","Data":"bf35d3ee60fe8915138a5071f67bd686aac4f05295b7ac5b23211a65be172ae5"} Oct 05 20:19:28 crc kubenswrapper[4753]: I1005 20:19:28.275129 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhg75" event={"ID":"7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8","Type":"ContainerStarted","Data":"ea69508a11bf8309b46bae07ae9c9cdf2562e49f35cf9cbba524afc9938c3bd9"} Oct 05 20:19:28 crc kubenswrapper[4753]: I1005 
20:19:28.292213 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jq98s" podStartSLOduration=3.78503366 podStartE2EDuration="6.292197325s" podCreationTimestamp="2025-10-05 20:19:22 +0000 UTC" firstStartedPulling="2025-10-05 20:19:24.233506517 +0000 UTC m=+273.081834749" lastFinishedPulling="2025-10-05 20:19:26.740670162 +0000 UTC m=+275.588998414" observedRunningTime="2025-10-05 20:19:28.289793765 +0000 UTC m=+277.138121997" watchObservedRunningTime="2025-10-05 20:19:28.292197325 +0000 UTC m=+277.140525557" Oct 05 20:19:28 crc kubenswrapper[4753]: I1005 20:19:28.307902 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nhg75" podStartSLOduration=3.867394747 podStartE2EDuration="6.307884666s" podCreationTimestamp="2025-10-05 20:19:22 +0000 UTC" firstStartedPulling="2025-10-05 20:19:24.225760949 +0000 UTC m=+273.074089181" lastFinishedPulling="2025-10-05 20:19:26.666250858 +0000 UTC m=+275.514579100" observedRunningTime="2025-10-05 20:19:28.307627018 +0000 UTC m=+277.155955270" watchObservedRunningTime="2025-10-05 20:19:28.307884666 +0000 UTC m=+277.156212898" Oct 05 20:19:30 crc kubenswrapper[4753]: I1005 20:19:30.643967 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qnn2p" Oct 05 20:19:30 crc kubenswrapper[4753]: I1005 20:19:30.644373 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qnn2p" Oct 05 20:19:30 crc kubenswrapper[4753]: I1005 20:19:30.681759 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qnn2p" Oct 05 20:19:31 crc kubenswrapper[4753]: I1005 20:19:31.140895 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gdpm6" Oct 05 20:19:31 crc 
kubenswrapper[4753]: I1005 20:19:31.141077 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gdpm6" Oct 05 20:19:31 crc kubenswrapper[4753]: I1005 20:19:31.186046 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gdpm6" Oct 05 20:19:31 crc kubenswrapper[4753]: I1005 20:19:31.322860 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gdpm6" Oct 05 20:19:31 crc kubenswrapper[4753]: I1005 20:19:31.338078 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qnn2p" Oct 05 20:19:33 crc kubenswrapper[4753]: I1005 20:19:33.060038 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nhg75" Oct 05 20:19:33 crc kubenswrapper[4753]: I1005 20:19:33.061769 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nhg75" Oct 05 20:19:33 crc kubenswrapper[4753]: I1005 20:19:33.096196 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nhg75" Oct 05 20:19:33 crc kubenswrapper[4753]: I1005 20:19:33.253544 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jq98s" Oct 05 20:19:33 crc kubenswrapper[4753]: I1005 20:19:33.253613 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jq98s" Oct 05 20:19:33 crc kubenswrapper[4753]: I1005 20:19:33.318859 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jq98s" Oct 05 20:19:33 crc kubenswrapper[4753]: I1005 20:19:33.364636 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-nhg75" Oct 05 20:19:33 crc kubenswrapper[4753]: I1005 20:19:33.378229 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jq98s" Oct 05 20:20:34 crc kubenswrapper[4753]: I1005 20:20:34.490184 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:20:34 crc kubenswrapper[4753]: I1005 20:20:34.490995 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:21:04 crc kubenswrapper[4753]: I1005 20:21:04.490530 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:21:04 crc kubenswrapper[4753]: I1005 20:21:04.491632 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:21:34 crc kubenswrapper[4753]: I1005 20:21:34.490384 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:21:34 crc kubenswrapper[4753]: I1005 20:21:34.492048 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:21:34 crc kubenswrapper[4753]: I1005 20:21:34.492257 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:21:34 crc kubenswrapper[4753]: I1005 20:21:34.493039 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d036ff9dbc389fc79b9fd8e46b1fb2f4c9ac8b14987017b53068ab8b2e6b56cb"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 20:21:34 crc kubenswrapper[4753]: I1005 20:21:34.493240 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" containerID="cri-o://d036ff9dbc389fc79b9fd8e46b1fb2f4c9ac8b14987017b53068ab8b2e6b56cb" gracePeriod=600 Oct 05 20:21:35 crc kubenswrapper[4753]: I1005 20:21:35.037674 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="d036ff9dbc389fc79b9fd8e46b1fb2f4c9ac8b14987017b53068ab8b2e6b56cb" exitCode=0 Oct 05 20:21:35 crc kubenswrapper[4753]: I1005 20:21:35.037763 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" 
event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"d036ff9dbc389fc79b9fd8e46b1fb2f4c9ac8b14987017b53068ab8b2e6b56cb"} Oct 05 20:21:35 crc kubenswrapper[4753]: I1005 20:21:35.038234 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"5ebcd29664463350c05d6256aae98be7654b14c206df5bbe017e8126dff22fad"} Oct 05 20:21:35 crc kubenswrapper[4753]: I1005 20:21:35.038262 4753 scope.go:117] "RemoveContainer" containerID="f4e79e247970335f89b8f52baa9ff77046d6b6268b27ced5897daf482809e4ef" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.454496 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xhdlf"] Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.456465 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.478190 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xhdlf"] Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.562559 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv2zt\" (UniqueName: \"kubernetes.io/projected/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-kube-api-access-nv2zt\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.562629 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-trusted-ca\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: 
\"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.562682 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-registry-certificates\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.562715 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.562753 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-bound-sa-token\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.562775 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.562838 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-registry-tls\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.562877 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.586489 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.663746 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.663796 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-registry-tls\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.663825 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.663861 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv2zt\" (UniqueName: \"kubernetes.io/projected/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-kube-api-access-nv2zt\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.663880 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-trusted-ca\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.663897 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-registry-certificates\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.663930 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-bound-sa-token\") pod 
\"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.664599 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.665449 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-registry-certificates\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.665588 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-trusted-ca\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.668771 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-registry-tls\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.669004 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.689714 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-bound-sa-token\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.692033 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv2zt\" (UniqueName: \"kubernetes.io/projected/cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e-kube-api-access-nv2zt\") pod \"image-registry-66df7c8f76-xhdlf\" (UID: \"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e\") " pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.770692 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:17 crc kubenswrapper[4753]: I1005 20:23:17.940876 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xhdlf"] Oct 05 20:23:18 crc kubenswrapper[4753]: I1005 20:23:18.641598 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" event={"ID":"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e","Type":"ContainerStarted","Data":"cc16b32a14309ca860d58182be888c417271f98739a463f7bc8fcc630843182c"} Oct 05 20:23:18 crc kubenswrapper[4753]: I1005 20:23:18.643085 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:18 crc kubenswrapper[4753]: I1005 20:23:18.643244 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" event={"ID":"cb63b82d-9a4e-4690-adf7-c0eb0cafdf3e","Type":"ContainerStarted","Data":"0710288c82253ebeb445aced9ae4acd7b5394e9997956927961ef7599970d141"} Oct 05 20:23:18 crc kubenswrapper[4753]: I1005 20:23:18.661344 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" podStartSLOduration=1.661322081 podStartE2EDuration="1.661322081s" podCreationTimestamp="2025-10-05 20:23:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:23:18.658352909 +0000 UTC m=+507.506681201" watchObservedRunningTime="2025-10-05 20:23:18.661322081 +0000 UTC m=+507.509650323" Oct 05 20:23:34 crc kubenswrapper[4753]: I1005 20:23:34.490494 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:23:34 crc kubenswrapper[4753]: I1005 20:23:34.491277 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:23:37 crc kubenswrapper[4753]: I1005 20:23:37.780575 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-xhdlf" Oct 05 20:23:37 crc kubenswrapper[4753]: I1005 20:23:37.863130 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rr2d7"] Oct 05 20:23:51 crc kubenswrapper[4753]: I1005 20:23:51.971392 4753 scope.go:117] "RemoveContainer" containerID="ce4a0acf3a94c7591afed0b0bdc02191bcd5b70685d52443e5a62e12d609266c" Oct 05 20:24:02 crc kubenswrapper[4753]: I1005 20:24:02.921115 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" podUID="5da3ed48-9a09-47d3-9bcb-f572a962b5fb" containerName="registry" containerID="cri-o://932de19ce33539bd124888c4457db35d6c378cdd5b775c9a955f241d45cf5cfe" gracePeriod=30 Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.300334 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.419614 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-ca-trust-extracted\") pod \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.419679 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-installation-pull-secrets\") pod \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.419798 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-registry-certificates\") pod \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.419828 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-registry-tls\") pod \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.419858 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-bound-sa-token\") pod \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.419916 4753 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-p2hd6\" (UniqueName: \"kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-kube-api-access-p2hd6\") pod \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.420087 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.420181 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-trusted-ca\") pod \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\" (UID: \"5da3ed48-9a09-47d3-9bcb-f572a962b5fb\") " Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.421016 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5da3ed48-9a09-47d3-9bcb-f572a962b5fb" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.423843 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5da3ed48-9a09-47d3-9bcb-f572a962b5fb" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.428207 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-kube-api-access-p2hd6" (OuterVolumeSpecName: "kube-api-access-p2hd6") pod "5da3ed48-9a09-47d3-9bcb-f572a962b5fb" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb"). InnerVolumeSpecName "kube-api-access-p2hd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.436205 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5da3ed48-9a09-47d3-9bcb-f572a962b5fb" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.436841 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5da3ed48-9a09-47d3-9bcb-f572a962b5fb" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.437380 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "5da3ed48-9a09-47d3-9bcb-f572a962b5fb" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.437442 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5da3ed48-9a09-47d3-9bcb-f572a962b5fb" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.441834 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5da3ed48-9a09-47d3-9bcb-f572a962b5fb" (UID: "5da3ed48-9a09-47d3-9bcb-f572a962b5fb"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.522460 4753 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.522507 4753 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.522530 4753 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.522549 4753 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.522566 4753 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.522586 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2hd6\" (UniqueName: \"kubernetes.io/projected/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-kube-api-access-p2hd6\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.522602 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5da3ed48-9a09-47d3-9bcb-f572a962b5fb-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.914838 4753 generic.go:334] "Generic (PLEG): container finished" podID="5da3ed48-9a09-47d3-9bcb-f572a962b5fb" containerID="932de19ce33539bd124888c4457db35d6c378cdd5b775c9a955f241d45cf5cfe" exitCode=0 Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.914966 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.915167 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" event={"ID":"5da3ed48-9a09-47d3-9bcb-f572a962b5fb","Type":"ContainerDied","Data":"932de19ce33539bd124888c4457db35d6c378cdd5b775c9a955f241d45cf5cfe"} Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.915309 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rr2d7" event={"ID":"5da3ed48-9a09-47d3-9bcb-f572a962b5fb","Type":"ContainerDied","Data":"c5ed0dbfd78c2a823622a02e623e1838096ac86b8576976e195764628fd165a3"} Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.915375 4753 scope.go:117] "RemoveContainer" containerID="932de19ce33539bd124888c4457db35d6c378cdd5b775c9a955f241d45cf5cfe" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.938603 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rr2d7"] Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.942165 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rr2d7"] Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.953071 4753 scope.go:117] "RemoveContainer" containerID="932de19ce33539bd124888c4457db35d6c378cdd5b775c9a955f241d45cf5cfe" Oct 05 20:24:03 crc kubenswrapper[4753]: E1005 20:24:03.954598 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"932de19ce33539bd124888c4457db35d6c378cdd5b775c9a955f241d45cf5cfe\": container with ID starting with 932de19ce33539bd124888c4457db35d6c378cdd5b775c9a955f241d45cf5cfe not found: ID does not exist" containerID="932de19ce33539bd124888c4457db35d6c378cdd5b775c9a955f241d45cf5cfe" Oct 05 20:24:03 crc kubenswrapper[4753]: I1005 20:24:03.954629 
4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"932de19ce33539bd124888c4457db35d6c378cdd5b775c9a955f241d45cf5cfe"} err="failed to get container status \"932de19ce33539bd124888c4457db35d6c378cdd5b775c9a955f241d45cf5cfe\": rpc error: code = NotFound desc = could not find container \"932de19ce33539bd124888c4457db35d6c378cdd5b775c9a955f241d45cf5cfe\": container with ID starting with 932de19ce33539bd124888c4457db35d6c378cdd5b775c9a955f241d45cf5cfe not found: ID does not exist" Oct 05 20:24:04 crc kubenswrapper[4753]: I1005 20:24:04.490756 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:24:04 crc kubenswrapper[4753]: I1005 20:24:04.490841 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:24:05 crc kubenswrapper[4753]: I1005 20:24:05.879187 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5da3ed48-9a09-47d3-9bcb-f572a962b5fb" path="/var/lib/kubelet/pods/5da3ed48-9a09-47d3-9bcb-f572a962b5fb/volumes" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.601274 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qvslf"] Oct 05 20:24:18 crc kubenswrapper[4753]: E1005 20:24:18.601964 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da3ed48-9a09-47d3-9bcb-f572a962b5fb" containerName="registry" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.601977 4753 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5da3ed48-9a09-47d3-9bcb-f572a962b5fb" containerName="registry" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.602079 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da3ed48-9a09-47d3-9bcb-f572a962b5fb" containerName="registry" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.602484 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qvslf" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.604900 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-54vmd"] Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.606739 4753 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-pjv8z" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.606934 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.609239 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-54vmd" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.609515 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.617582 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qvslf"] Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.619964 4753 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-gk5bd" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.623267 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-shkg2"] Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.624090 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-shkg2" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.625932 4753 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-bt8v6" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.635831 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pphqn\" (UniqueName: \"kubernetes.io/projected/3749909c-e0a6-4a84-8e16-e8d104f8bb29-kube-api-access-pphqn\") pod \"cert-manager-cainjector-7f985d654d-qvslf\" (UID: \"3749909c-e0a6-4a84-8e16-e8d104f8bb29\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qvslf" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.635877 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64mth\" (UniqueName: \"kubernetes.io/projected/183d1891-ba1d-4ce0-83bd-9a547d099416-kube-api-access-64mth\") pod \"cert-manager-webhook-5655c58dd6-shkg2\" (UID: \"183d1891-ba1d-4ce0-83bd-9a547d099416\") " 
pod="cert-manager/cert-manager-webhook-5655c58dd6-shkg2" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.635904 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xx8v\" (UniqueName: \"kubernetes.io/projected/b8aa872e-b15b-458f-8bf4-0057a25d5d43-kube-api-access-5xx8v\") pod \"cert-manager-5b446d88c5-54vmd\" (UID: \"b8aa872e-b15b-458f-8bf4-0057a25d5d43\") " pod="cert-manager/cert-manager-5b446d88c5-54vmd" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.656299 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-54vmd"] Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.657080 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-shkg2"] Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.737097 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xx8v\" (UniqueName: \"kubernetes.io/projected/b8aa872e-b15b-458f-8bf4-0057a25d5d43-kube-api-access-5xx8v\") pod \"cert-manager-5b446d88c5-54vmd\" (UID: \"b8aa872e-b15b-458f-8bf4-0057a25d5d43\") " pod="cert-manager/cert-manager-5b446d88c5-54vmd" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.737203 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pphqn\" (UniqueName: \"kubernetes.io/projected/3749909c-e0a6-4a84-8e16-e8d104f8bb29-kube-api-access-pphqn\") pod \"cert-manager-cainjector-7f985d654d-qvslf\" (UID: \"3749909c-e0a6-4a84-8e16-e8d104f8bb29\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qvslf" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.737567 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64mth\" (UniqueName: \"kubernetes.io/projected/183d1891-ba1d-4ce0-83bd-9a547d099416-kube-api-access-64mth\") pod \"cert-manager-webhook-5655c58dd6-shkg2\" (UID: 
\"183d1891-ba1d-4ce0-83bd-9a547d099416\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-shkg2" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.755830 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64mth\" (UniqueName: \"kubernetes.io/projected/183d1891-ba1d-4ce0-83bd-9a547d099416-kube-api-access-64mth\") pod \"cert-manager-webhook-5655c58dd6-shkg2\" (UID: \"183d1891-ba1d-4ce0-83bd-9a547d099416\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-shkg2" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.758833 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pphqn\" (UniqueName: \"kubernetes.io/projected/3749909c-e0a6-4a84-8e16-e8d104f8bb29-kube-api-access-pphqn\") pod \"cert-manager-cainjector-7f985d654d-qvslf\" (UID: \"3749909c-e0a6-4a84-8e16-e8d104f8bb29\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qvslf" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.770582 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xx8v\" (UniqueName: \"kubernetes.io/projected/b8aa872e-b15b-458f-8bf4-0057a25d5d43-kube-api-access-5xx8v\") pod \"cert-manager-5b446d88c5-54vmd\" (UID: \"b8aa872e-b15b-458f-8bf4-0057a25d5d43\") " pod="cert-manager/cert-manager-5b446d88c5-54vmd" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.918745 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qvslf" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.924130 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-54vmd" Oct 05 20:24:18 crc kubenswrapper[4753]: I1005 20:24:18.940047 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-shkg2" Oct 05 20:24:19 crc kubenswrapper[4753]: I1005 20:24:19.397031 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-54vmd"] Oct 05 20:24:19 crc kubenswrapper[4753]: I1005 20:24:19.410260 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 05 20:24:19 crc kubenswrapper[4753]: I1005 20:24:19.443395 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qvslf"] Oct 05 20:24:19 crc kubenswrapper[4753]: I1005 20:24:19.461711 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-shkg2"] Oct 05 20:24:20 crc kubenswrapper[4753]: I1005 20:24:20.031124 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-54vmd" event={"ID":"b8aa872e-b15b-458f-8bf4-0057a25d5d43","Type":"ContainerStarted","Data":"d178b02549fc796e7dbc54002d3da322fa18e2b8fbe048d11e9b2ce83be574f4"} Oct 05 20:24:20 crc kubenswrapper[4753]: I1005 20:24:20.032954 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qvslf" event={"ID":"3749909c-e0a6-4a84-8e16-e8d104f8bb29","Type":"ContainerStarted","Data":"35817bd5477f4eabe96b774d81c8e429aa31196e4555969a43c2ca82e9dddb1e"} Oct 05 20:24:20 crc kubenswrapper[4753]: I1005 20:24:20.034656 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-shkg2" event={"ID":"183d1891-ba1d-4ce0-83bd-9a547d099416","Type":"ContainerStarted","Data":"0bd833df4ed8aca22b26ab030b6d047b7c7ed5110a6afa3d5ddb8fc6d84a2d40"} Oct 05 20:24:23 crc kubenswrapper[4753]: I1005 20:24:23.048814 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-54vmd" 
event={"ID":"b8aa872e-b15b-458f-8bf4-0057a25d5d43","Type":"ContainerStarted","Data":"33b0ba058127ecd76232238915ed74c525e2cfa7d3b026a59d3d972b4345054c"} Oct 05 20:24:23 crc kubenswrapper[4753]: I1005 20:24:23.050757 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qvslf" event={"ID":"3749909c-e0a6-4a84-8e16-e8d104f8bb29","Type":"ContainerStarted","Data":"657df27d2752386c63e698fdf3f8f20967d538dbd57b1a87250b3797cd330774"} Oct 05 20:24:23 crc kubenswrapper[4753]: I1005 20:24:23.052071 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-shkg2" event={"ID":"183d1891-ba1d-4ce0-83bd-9a547d099416","Type":"ContainerStarted","Data":"7a48488db0cbe1aecc194b5118dde6d5c8599f5dc372a3aa18984c05a2848b64"} Oct 05 20:24:23 crc kubenswrapper[4753]: I1005 20:24:23.052191 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-shkg2" Oct 05 20:24:23 crc kubenswrapper[4753]: I1005 20:24:23.068831 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-54vmd" podStartSLOduration=2.098787671 podStartE2EDuration="5.068811997s" podCreationTimestamp="2025-10-05 20:24:18 +0000 UTC" firstStartedPulling="2025-10-05 20:24:19.410016675 +0000 UTC m=+568.258344908" lastFinishedPulling="2025-10-05 20:24:22.380040992 +0000 UTC m=+571.228369234" observedRunningTime="2025-10-05 20:24:23.06502022 +0000 UTC m=+571.913348452" watchObservedRunningTime="2025-10-05 20:24:23.068811997 +0000 UTC m=+571.917140239" Oct 05 20:24:23 crc kubenswrapper[4753]: I1005 20:24:23.127668 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-shkg2" podStartSLOduration=2.157257154 podStartE2EDuration="5.127650613s" podCreationTimestamp="2025-10-05 20:24:18 +0000 UTC" firstStartedPulling="2025-10-05 20:24:19.470526232 +0000 UTC 
m=+568.318854474" lastFinishedPulling="2025-10-05 20:24:22.440919671 +0000 UTC m=+571.289247933" observedRunningTime="2025-10-05 20:24:23.124122504 +0000 UTC m=+571.972450736" watchObservedRunningTime="2025-10-05 20:24:23.127650613 +0000 UTC m=+571.975978845" Oct 05 20:24:23 crc kubenswrapper[4753]: I1005 20:24:23.128841 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-qvslf" podStartSLOduration=2.332943304 podStartE2EDuration="5.12883487s" podCreationTimestamp="2025-10-05 20:24:18 +0000 UTC" firstStartedPulling="2025-10-05 20:24:19.450483741 +0000 UTC m=+568.298811973" lastFinishedPulling="2025-10-05 20:24:22.246375287 +0000 UTC m=+571.094703539" observedRunningTime="2025-10-05 20:24:23.091164401 +0000 UTC m=+571.939492623" watchObservedRunningTime="2025-10-05 20:24:23.12883487 +0000 UTC m=+571.977163102" Oct 05 20:24:28 crc kubenswrapper[4753]: I1005 20:24:28.924021 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-htbfn"] Oct 05 20:24:28 crc kubenswrapper[4753]: I1005 20:24:28.926131 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="northd" containerID="cri-o://80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d" gracePeriod=30 Oct 05 20:24:28 crc kubenswrapper[4753]: I1005 20:24:28.926405 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="sbdb" containerID="cri-o://f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642" gracePeriod=30 Oct 05 20:24:28 crc kubenswrapper[4753]: I1005 20:24:28.926494 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" 
podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovn-acl-logging" containerID="cri-o://2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c" gracePeriod=30 Oct 05 20:24:28 crc kubenswrapper[4753]: I1005 20:24:28.926540 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="nbdb" containerID="cri-o://9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd" gracePeriod=30 Oct 05 20:24:28 crc kubenswrapper[4753]: I1005 20:24:28.926343 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="kube-rbac-proxy-node" containerID="cri-o://ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243" gracePeriod=30 Oct 05 20:24:28 crc kubenswrapper[4753]: I1005 20:24:28.926458 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58" gracePeriod=30 Oct 05 20:24:28 crc kubenswrapper[4753]: I1005 20:24:28.926097 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovn-controller" containerID="cri-o://f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4" gracePeriod=30 Oct 05 20:24:28 crc kubenswrapper[4753]: I1005 20:24:28.975474 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-shkg2" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.010401 4753 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" containerID="cri-o://ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11" gracePeriod=30 Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.101764 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/3.log" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.115086 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovn-acl-logging/0.log" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.121172 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovn-controller/0.log" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.121739 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerID="c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58" exitCode=0 Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.121767 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerID="ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243" exitCode=0 Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.121776 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerID="2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c" exitCode=143 Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.121832 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" 
event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerDied","Data":"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58"} Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.121861 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerDied","Data":"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243"} Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.121873 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerDied","Data":"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c"} Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.128972 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr5q8_8a6cead6-0872-4b49-a08c-529805f646f2/kube-multus/2.log" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.130268 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr5q8_8a6cead6-0872-4b49-a08c-529805f646f2/kube-multus/1.log" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.130300 4753 generic.go:334] "Generic (PLEG): container finished" podID="8a6cead6-0872-4b49-a08c-529805f646f2" containerID="9909c718bf28ac2a716f30feb9dbce15eba410ec607a82b1fa8faf6328b099a0" exitCode=2 Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.130328 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr5q8" event={"ID":"8a6cead6-0872-4b49-a08c-529805f646f2","Type":"ContainerDied","Data":"9909c718bf28ac2a716f30feb9dbce15eba410ec607a82b1fa8faf6328b099a0"} Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.130357 4753 scope.go:117] "RemoveContainer" containerID="d7ac7d15272eb1688a6a443c556a4ef57930c430084243bdff7df0d017044402" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 
20:24:29.130780 4753 scope.go:117] "RemoveContainer" containerID="9909c718bf28ac2a716f30feb9dbce15eba410ec607a82b1fa8faf6328b099a0" Oct 05 20:24:29 crc kubenswrapper[4753]: E1005 20:24:29.131006 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zr5q8_openshift-multus(8a6cead6-0872-4b49-a08c-529805f646f2)\"" pod="openshift-multus/multus-zr5q8" podUID="8a6cead6-0872-4b49-a08c-529805f646f2" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.667348 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/3.log" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.670897 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovn-acl-logging/0.log" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.671887 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovn-controller/0.log" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.672254 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.714824 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-run-netns\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.714882 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-slash\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.714901 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-cni-netd\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.714923 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-cni-bin\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.714956 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovnkube-script-lib\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.714995 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-env-overrides\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715035 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715070 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovnkube-config\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715099 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-kubelet\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715119 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-log-socket\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715163 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovn-node-metrics-cert\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: 
\"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715189 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-node-log\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715220 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-systemd\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715251 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-openvswitch\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715378 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-run-ovn-kubernetes\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715413 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-systemd-units\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715436 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-var-lib-openvswitch\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715469 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-ovn\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715499 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-etc-openvswitch\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715529 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7j8m\" (UniqueName: \"kubernetes.io/projected/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-kube-api-access-k7j8m\") pod \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\" (UID: \"fa1e6bd4-ce05-4757-bab2-6addb9d0111e\") " Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715793 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715858 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715918 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-slash" (OuterVolumeSpecName: "host-slash") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.715996 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.716019 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.716041 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.716063 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.716396 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.716446 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.716471 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.716496 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-node-log" (OuterVolumeSpecName: "node-log") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.716519 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.716565 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.716590 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.716673 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.716782 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-log-socket" (OuterVolumeSpecName: "log-socket") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.717061 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.722516 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.722778 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-kube-api-access-k7j8m" (OuterVolumeSpecName: "kube-api-access-k7j8m") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "kube-api-access-k7j8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.738443 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fa1e6bd4-ce05-4757-bab2-6addb9d0111e" (UID: "fa1e6bd4-ce05-4757-bab2-6addb9d0111e"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741079 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-stmvz"] Oct 05 20:24:29 crc kubenswrapper[4753]: E1005 20:24:29.741372 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovn-acl-logging" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741392 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovn-acl-logging" Oct 05 20:24:29 crc kubenswrapper[4753]: E1005 20:24:29.741403 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="nbdb" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741410 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="nbdb" Oct 05 20:24:29 crc kubenswrapper[4753]: E1005 20:24:29.741423 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="sbdb" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741429 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="sbdb" Oct 05 20:24:29 crc kubenswrapper[4753]: E1005 20:24:29.741438 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741445 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: E1005 20:24:29.741455 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="kube-rbac-proxy-ovn-metrics" Oct 05 
20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741462 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="kube-rbac-proxy-ovn-metrics" Oct 05 20:24:29 crc kubenswrapper[4753]: E1005 20:24:29.741475 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741482 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: E1005 20:24:29.741490 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741496 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: E1005 20:24:29.741507 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741514 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: E1005 20:24:29.741527 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="kubecfg-setup" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741534 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="kubecfg-setup" Oct 05 20:24:29 crc kubenswrapper[4753]: E1005 20:24:29.741545 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="northd" Oct 
05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741551 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="northd" Oct 05 20:24:29 crc kubenswrapper[4753]: E1005 20:24:29.741559 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="kube-rbac-proxy-node" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741567 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="kube-rbac-proxy-node" Oct 05 20:24:29 crc kubenswrapper[4753]: E1005 20:24:29.741579 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovn-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741585 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovn-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741690 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741699 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="sbdb" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741711 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="kube-rbac-proxy-node" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741724 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="nbdb" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741732 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovn-controller" Oct 05 20:24:29 
crc kubenswrapper[4753]: I1005 20:24:29.741740 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="northd" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741750 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741758 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovn-acl-logging" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741769 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="kube-rbac-proxy-ovn-metrics" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741779 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: E1005 20:24:29.741889 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741899 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.741998 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.742229 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerName="ovnkube-controller" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.744033 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816419 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-systemd-units\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816466 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-log-socket\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816491 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816513 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b82e45fc-3c57-447f-96f3-06498b059364-env-overrides\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816631 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/b82e45fc-3c57-447f-96f3-06498b059364-ovn-node-metrics-cert\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816706 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b82e45fc-3c57-447f-96f3-06498b059364-ovnkube-config\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816732 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-run-netns\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816751 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-run-ovn\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816780 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-run-systemd\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816822 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-slash\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816843 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-run-openvswitch\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816873 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-run-ovn-kubernetes\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816892 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq4fw\" (UniqueName: \"kubernetes.io/projected/b82e45fc-3c57-447f-96f3-06498b059364-kube-api-access-bq4fw\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816919 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-kubelet\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816971 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b82e45fc-3c57-447f-96f3-06498b059364-ovnkube-script-lib\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.816999 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-cni-bin\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817034 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-etc-openvswitch\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817053 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-cni-netd\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817074 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-node-log\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817094 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-var-lib-openvswitch\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817172 4753 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817185 4753 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817197 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7j8m\" (UniqueName: \"kubernetes.io/projected/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-kube-api-access-k7j8m\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817206 4753 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817215 4753 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-slash\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817224 4753 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817233 4753 reconciler_common.go:293] "Volume detached for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817242 4753 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817251 4753 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817260 4753 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817269 4753 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817280 4753 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-log-socket\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817289 4753 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817297 4753 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817307 4753 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-node-log\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817315 4753 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817323 4753 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817332 4753 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817340 4753 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.817348 4753 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa1e6bd4-ce05-4757-bab2-6addb9d0111e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.918864 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b82e45fc-3c57-447f-96f3-06498b059364-ovnkube-config\") pod 
\"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.919171 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-run-netns\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.919287 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-run-ovn\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.919385 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-run-systemd\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.919475 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-run-systemd\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.919302 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-run-netns\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.919332 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-run-ovn\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.919696 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-slash\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.919848 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-run-openvswitch\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.920013 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-run-ovn-kubernetes\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.920187 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq4fw\" (UniqueName: \"kubernetes.io/projected/b82e45fc-3c57-447f-96f3-06498b059364-kube-api-access-bq4fw\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc 
kubenswrapper[4753]: I1005 20:24:29.920613 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-kubelet\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.919764 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-slash\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.920129 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-run-ovn-kubernetes\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.919732 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b82e45fc-3c57-447f-96f3-06498b059364-ovnkube-config\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.920762 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-kubelet\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.919970 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-run-openvswitch\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.921303 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b82e45fc-3c57-447f-96f3-06498b059364-ovnkube-script-lib\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.921430 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-cni-bin\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.921562 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b82e45fc-3c57-447f-96f3-06498b059364-ovnkube-script-lib\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.921551 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-cni-bin\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.921775 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-etc-openvswitch\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.921931 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-cni-netd\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.922080 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-node-log\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.922243 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-var-lib-openvswitch\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.921889 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-etc-openvswitch\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.922507 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-systemd-units\") pod 
\"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.922038 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-cni-netd\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.922350 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-var-lib-openvswitch\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.922205 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-node-log\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.922800 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-systemd-units\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.922914 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-log-socket\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.923072 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.923258 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b82e45fc-3c57-447f-96f3-06498b059364-env-overrides\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.923878 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b82e45fc-3c57-447f-96f3-06498b059364-ovn-node-metrics-cert\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.923019 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-log-socket\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.923824 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b82e45fc-3c57-447f-96f3-06498b059364-env-overrides\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.923128 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82e45fc-3c57-447f-96f3-06498b059364-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.926475 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b82e45fc-3c57-447f-96f3-06498b059364-ovn-node-metrics-cert\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:29 crc kubenswrapper[4753]: I1005 20:24:29.938763 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq4fw\" (UniqueName: \"kubernetes.io/projected/b82e45fc-3c57-447f-96f3-06498b059364-kube-api-access-bq4fw\") pod \"ovnkube-node-stmvz\" (UID: \"b82e45fc-3c57-447f-96f3-06498b059364\") " pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.058219 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.135961 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" event={"ID":"b82e45fc-3c57-447f-96f3-06498b059364","Type":"ContainerStarted","Data":"33c0b0a353e703465118edfdfc4e9b43ce478f4c4d49a58eb5236ddb3ee7a83e"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.137293 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr5q8_8a6cead6-0872-4b49-a08c-529805f646f2/kube-multus/2.log" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.139701 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovnkube-controller/3.log" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.141776 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovn-acl-logging/0.log" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.142358 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-htbfn_fa1e6bd4-ce05-4757-bab2-6addb9d0111e/ovn-controller/0.log" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.142766 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerID="ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11" exitCode=0 Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.142826 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerID="f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642" exitCode=0 Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.142834 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" 
containerID="9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd" exitCode=0 Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.142843 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerID="80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d" exitCode=0 Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.142849 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" containerID="f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4" exitCode=143 Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.142904 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.142886 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerDied","Data":"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.143211 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerDied","Data":"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.143242 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerDied","Data":"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.143251 4753 scope.go:117] "RemoveContainer" containerID="ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.143256 4753 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerDied","Data":"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.143274 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerDied","Data":"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.143287 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-htbfn" event={"ID":"fa1e6bd4-ce05-4757-bab2-6addb9d0111e","Type":"ContainerDied","Data":"aae37b94c3f6b9a2fa8bd20b3a8be44fa4f89f3ef663db20e9978ff3f436eb06"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.143302 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.143317 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.143324 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.143334 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.143341 4753 pod_container_deletor.go:114] "Failed to issue the 
request to remove container" containerID={"Type":"cri-o","ID":"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.143350 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.143357 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.143364 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.143371 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9"} Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.169779 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-htbfn"] Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.169932 4753 scope.go:117] "RemoveContainer" containerID="84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.175525 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-htbfn"] Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.192827 4753 scope.go:117] "RemoveContainer" containerID="f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.265981 4753 scope.go:117] "RemoveContainer" 
containerID="9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.285064 4753 scope.go:117] "RemoveContainer" containerID="80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.304925 4753 scope.go:117] "RemoveContainer" containerID="c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.339345 4753 scope.go:117] "RemoveContainer" containerID="ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.353206 4753 scope.go:117] "RemoveContainer" containerID="2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.367201 4753 scope.go:117] "RemoveContainer" containerID="f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.385710 4753 scope.go:117] "RemoveContainer" containerID="4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.407299 4753 scope.go:117] "RemoveContainer" containerID="ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11" Oct 05 20:24:30 crc kubenswrapper[4753]: E1005 20:24:30.407761 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11\": container with ID starting with ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11 not found: ID does not exist" containerID="ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.407787 4753 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11"} err="failed to get container status \"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11\": rpc error: code = NotFound desc = could not find container \"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11\": container with ID starting with ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.407809 4753 scope.go:117] "RemoveContainer" containerID="84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16" Oct 05 20:24:30 crc kubenswrapper[4753]: E1005 20:24:30.408296 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\": container with ID starting with 84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16 not found: ID does not exist" containerID="84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.408312 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16"} err="failed to get container status \"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\": rpc error: code = NotFound desc = could not find container \"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\": container with ID starting with 84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.408323 4753 scope.go:117] "RemoveContainer" containerID="f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642" Oct 05 20:24:30 crc kubenswrapper[4753]: E1005 20:24:30.408688 4753 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\": container with ID starting with f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642 not found: ID does not exist" containerID="f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.408705 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642"} err="failed to get container status \"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\": rpc error: code = NotFound desc = could not find container \"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\": container with ID starting with f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.408730 4753 scope.go:117] "RemoveContainer" containerID="9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd" Oct 05 20:24:30 crc kubenswrapper[4753]: E1005 20:24:30.409017 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\": container with ID starting with 9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd not found: ID does not exist" containerID="9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.409038 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd"} err="failed to get container status \"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\": rpc error: code = NotFound desc = could not find container 
\"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\": container with ID starting with 9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.409052 4753 scope.go:117] "RemoveContainer" containerID="80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d" Oct 05 20:24:30 crc kubenswrapper[4753]: E1005 20:24:30.409371 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\": container with ID starting with 80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d not found: ID does not exist" containerID="80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.409389 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d"} err="failed to get container status \"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\": rpc error: code = NotFound desc = could not find container \"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\": container with ID starting with 80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.409413 4753 scope.go:117] "RemoveContainer" containerID="c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58" Oct 05 20:24:30 crc kubenswrapper[4753]: E1005 20:24:30.409701 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\": container with ID starting with c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58 not found: ID does not exist" 
containerID="c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.409722 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58"} err="failed to get container status \"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\": rpc error: code = NotFound desc = could not find container \"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\": container with ID starting with c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.409735 4753 scope.go:117] "RemoveContainer" containerID="ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243" Oct 05 20:24:30 crc kubenswrapper[4753]: E1005 20:24:30.409968 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\": container with ID starting with ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243 not found: ID does not exist" containerID="ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.409988 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243"} err="failed to get container status \"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\": rpc error: code = NotFound desc = could not find container \"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\": container with ID starting with ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.410000 4753 scope.go:117] 
"RemoveContainer" containerID="2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c" Oct 05 20:24:30 crc kubenswrapper[4753]: E1005 20:24:30.410416 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\": container with ID starting with 2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c not found: ID does not exist" containerID="2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.410434 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c"} err="failed to get container status \"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\": rpc error: code = NotFound desc = could not find container \"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\": container with ID starting with 2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.410446 4753 scope.go:117] "RemoveContainer" containerID="f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4" Oct 05 20:24:30 crc kubenswrapper[4753]: E1005 20:24:30.410784 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\": container with ID starting with f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4 not found: ID does not exist" containerID="f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.410839 4753 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4"} err="failed to get container status \"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\": rpc error: code = NotFound desc = could not find container \"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\": container with ID starting with f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.410873 4753 scope.go:117] "RemoveContainer" containerID="4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9" Oct 05 20:24:30 crc kubenswrapper[4753]: E1005 20:24:30.411234 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\": container with ID starting with 4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9 not found: ID does not exist" containerID="4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.411255 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9"} err="failed to get container status \"4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\": rpc error: code = NotFound desc = could not find container \"4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\": container with ID starting with 4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.411267 4753 scope.go:117] "RemoveContainer" containerID="ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.411583 4753 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11"} err="failed to get container status \"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11\": rpc error: code = NotFound desc = could not find container \"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11\": container with ID starting with ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.411598 4753 scope.go:117] "RemoveContainer" containerID="84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.411914 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16"} err="failed to get container status \"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\": rpc error: code = NotFound desc = could not find container \"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\": container with ID starting with 84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.411932 4753 scope.go:117] "RemoveContainer" containerID="f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.412265 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642"} err="failed to get container status \"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\": rpc error: code = NotFound desc = could not find container \"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\": container with ID starting with f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642 not 
found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.412298 4753 scope.go:117] "RemoveContainer" containerID="9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.412628 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd"} err="failed to get container status \"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\": rpc error: code = NotFound desc = could not find container \"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\": container with ID starting with 9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.412646 4753 scope.go:117] "RemoveContainer" containerID="80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.412968 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d"} err="failed to get container status \"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\": rpc error: code = NotFound desc = could not find container \"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\": container with ID starting with 80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.412985 4753 scope.go:117] "RemoveContainer" containerID="c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.413273 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58"} err="failed to get 
container status \"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\": rpc error: code = NotFound desc = could not find container \"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\": container with ID starting with c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.413288 4753 scope.go:117] "RemoveContainer" containerID="ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.413522 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243"} err="failed to get container status \"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\": rpc error: code = NotFound desc = could not find container \"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\": container with ID starting with ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.413551 4753 scope.go:117] "RemoveContainer" containerID="2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.413890 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c"} err="failed to get container status \"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\": rpc error: code = NotFound desc = could not find container \"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\": container with ID starting with 2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.413910 4753 scope.go:117] "RemoveContainer" 
containerID="f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.414163 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4"} err="failed to get container status \"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\": rpc error: code = NotFound desc = could not find container \"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\": container with ID starting with f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.414193 4753 scope.go:117] "RemoveContainer" containerID="4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.414492 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9"} err="failed to get container status \"4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\": rpc error: code = NotFound desc = could not find container \"4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\": container with ID starting with 4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.414510 4753 scope.go:117] "RemoveContainer" containerID="ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.414830 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11"} err="failed to get container status \"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11\": rpc error: code = NotFound desc = could 
not find container \"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11\": container with ID starting with ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.414847 4753 scope.go:117] "RemoveContainer" containerID="84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.415177 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16"} err="failed to get container status \"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\": rpc error: code = NotFound desc = could not find container \"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\": container with ID starting with 84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.415215 4753 scope.go:117] "RemoveContainer" containerID="f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.415539 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642"} err="failed to get container status \"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\": rpc error: code = NotFound desc = could not find container \"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\": container with ID starting with f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.415560 4753 scope.go:117] "RemoveContainer" containerID="9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 
20:24:30.415792 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd"} err="failed to get container status \"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\": rpc error: code = NotFound desc = could not find container \"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\": container with ID starting with 9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.415810 4753 scope.go:117] "RemoveContainer" containerID="80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.416250 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d"} err="failed to get container status \"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\": rpc error: code = NotFound desc = could not find container \"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\": container with ID starting with 80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.416267 4753 scope.go:117] "RemoveContainer" containerID="c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.416626 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58"} err="failed to get container status \"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\": rpc error: code = NotFound desc = could not find container \"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\": container with ID starting with 
c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.416641 4753 scope.go:117] "RemoveContainer" containerID="ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.416969 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243"} err="failed to get container status \"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\": rpc error: code = NotFound desc = could not find container \"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\": container with ID starting with ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.416984 4753 scope.go:117] "RemoveContainer" containerID="2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.417365 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c"} err="failed to get container status \"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\": rpc error: code = NotFound desc = could not find container \"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\": container with ID starting with 2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.417398 4753 scope.go:117] "RemoveContainer" containerID="f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.417777 4753 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4"} err="failed to get container status \"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\": rpc error: code = NotFound desc = could not find container \"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\": container with ID starting with f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.417795 4753 scope.go:117] "RemoveContainer" containerID="4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.418117 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9"} err="failed to get container status \"4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\": rpc error: code = NotFound desc = could not find container \"4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\": container with ID starting with 4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.418172 4753 scope.go:117] "RemoveContainer" containerID="ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.418461 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11"} err="failed to get container status \"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11\": rpc error: code = NotFound desc = could not find container \"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11\": container with ID starting with ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11 not found: ID does not 
exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.418479 4753 scope.go:117] "RemoveContainer" containerID="84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.418696 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16"} err="failed to get container status \"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\": rpc error: code = NotFound desc = could not find container \"84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16\": container with ID starting with 84ba7f0e99ee9a9cf6f2f582858d35ec45ce045920556d28391dfb4350622e16 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.418712 4753 scope.go:117] "RemoveContainer" containerID="f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.418993 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642"} err="failed to get container status \"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\": rpc error: code = NotFound desc = could not find container \"f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642\": container with ID starting with f24f8af6886c32b32becdf0a25446b47b4ec70bfc2b5e20733db533aa9a38642 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.419007 4753 scope.go:117] "RemoveContainer" containerID="9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.419385 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd"} err="failed to get container status 
\"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\": rpc error: code = NotFound desc = could not find container \"9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd\": container with ID starting with 9d087132745838f0c6b24e751ca9eb28508dd76765dcb8c9dedbcc00a0b0b6cd not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.419402 4753 scope.go:117] "RemoveContainer" containerID="80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.419699 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d"} err="failed to get container status \"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\": rpc error: code = NotFound desc = could not find container \"80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d\": container with ID starting with 80c4a5c5fb5935c8aef69890d4df5e1df516b4f54fc99bb78b3492f0f28cb12d not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.419714 4753 scope.go:117] "RemoveContainer" containerID="c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.419923 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58"} err="failed to get container status \"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\": rpc error: code = NotFound desc = could not find container \"c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58\": container with ID starting with c49294afdbe6a8067726ae221eb834c23c3f994119d2ed9a03d353c5bb971e58 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.419943 4753 scope.go:117] "RemoveContainer" 
containerID="ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.420132 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243"} err="failed to get container status \"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\": rpc error: code = NotFound desc = could not find container \"ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243\": container with ID starting with ebedde909d519708a761a8ed6e04aec09f9b688e6ad54e69fd9c50b9e76a1243 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.420159 4753 scope.go:117] "RemoveContainer" containerID="2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.420419 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c"} err="failed to get container status \"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\": rpc error: code = NotFound desc = could not find container \"2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c\": container with ID starting with 2640b5288321d9ba3134e1683311442c65e1fe6d41cf095199bb8dd7a67f128c not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.420447 4753 scope.go:117] "RemoveContainer" containerID="f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.420710 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4"} err="failed to get container status \"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\": rpc error: code = NotFound desc = could 
not find container \"f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4\": container with ID starting with f766b3e2c2d91e2e470ec8a5302e1c6ec0a48eae6634af25541d402388fe8dd4 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.420740 4753 scope.go:117] "RemoveContainer" containerID="4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.421030 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9"} err="failed to get container status \"4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\": rpc error: code = NotFound desc = could not find container \"4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9\": container with ID starting with 4cd910a4278caa7df5240583e6c23e895173c6bc83451a482efbfc59bc8244c9 not found: ID does not exist" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.421055 4753 scope.go:117] "RemoveContainer" containerID="ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11" Oct 05 20:24:30 crc kubenswrapper[4753]: I1005 20:24:30.421315 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11"} err="failed to get container status \"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11\": rpc error: code = NotFound desc = could not find container \"ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11\": container with ID starting with ca300e2b270514641c2670f26bb1b931a79ddef64d88d51f83fc3435ecccbc11 not found: ID does not exist" Oct 05 20:24:31 crc kubenswrapper[4753]: I1005 20:24:31.156421 4753 generic.go:334] "Generic (PLEG): container finished" podID="b82e45fc-3c57-447f-96f3-06498b059364" 
containerID="2f50a6631c5337d44d10ca29234c5d555c504491ca0a9a62ec93aed58ca2ca23" exitCode=0 Oct 05 20:24:31 crc kubenswrapper[4753]: I1005 20:24:31.156567 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" event={"ID":"b82e45fc-3c57-447f-96f3-06498b059364","Type":"ContainerDied","Data":"2f50a6631c5337d44d10ca29234c5d555c504491ca0a9a62ec93aed58ca2ca23"} Oct 05 20:24:31 crc kubenswrapper[4753]: I1005 20:24:31.859224 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa1e6bd4-ce05-4757-bab2-6addb9d0111e" path="/var/lib/kubelet/pods/fa1e6bd4-ce05-4757-bab2-6addb9d0111e/volumes" Oct 05 20:24:32 crc kubenswrapper[4753]: I1005 20:24:32.173218 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" event={"ID":"b82e45fc-3c57-447f-96f3-06498b059364","Type":"ContainerStarted","Data":"f2a609dcdd51aa296269246f0b44a7c12763bc2e0891df96b2b1e94122901f39"} Oct 05 20:24:32 crc kubenswrapper[4753]: I1005 20:24:32.173293 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" event={"ID":"b82e45fc-3c57-447f-96f3-06498b059364","Type":"ContainerStarted","Data":"0e8e1730729a432e72af898cbb46558fb23196eaebbf031807f7d20d53196eba"} Oct 05 20:24:32 crc kubenswrapper[4753]: I1005 20:24:32.173309 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" event={"ID":"b82e45fc-3c57-447f-96f3-06498b059364","Type":"ContainerStarted","Data":"dfa7eb2a7b2faa85f08a0491f49e6801a82e96c6ef6b5dd4ab263ad4e71c1457"} Oct 05 20:24:32 crc kubenswrapper[4753]: I1005 20:24:32.173321 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" event={"ID":"b82e45fc-3c57-447f-96f3-06498b059364","Type":"ContainerStarted","Data":"254b982e273056d57ae824041816fcebda7a0944a1de2061e089c848018bbf56"} Oct 05 20:24:32 crc kubenswrapper[4753]: I1005 20:24:32.173358 
4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" event={"ID":"b82e45fc-3c57-447f-96f3-06498b059364","Type":"ContainerStarted","Data":"54ad1cd4cf807b4495c820b971f7c809dc18a8c48fe06e26b509ed79e968ee3f"} Oct 05 20:24:32 crc kubenswrapper[4753]: I1005 20:24:32.173373 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" event={"ID":"b82e45fc-3c57-447f-96f3-06498b059364","Type":"ContainerStarted","Data":"764f1a7b465eb7e832f3300e7f902b007361d5d50c5c5ac62584deb0f435a65b"} Oct 05 20:24:34 crc kubenswrapper[4753]: I1005 20:24:34.187509 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" event={"ID":"b82e45fc-3c57-447f-96f3-06498b059364","Type":"ContainerStarted","Data":"761f8f8beecf65b93b84cdba841837992a952e539327244bd3be091242105e2e"} Oct 05 20:24:34 crc kubenswrapper[4753]: I1005 20:24:34.490241 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:24:34 crc kubenswrapper[4753]: I1005 20:24:34.490614 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:24:34 crc kubenswrapper[4753]: I1005 20:24:34.490659 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:24:34 crc kubenswrapper[4753]: I1005 20:24:34.491218 4753 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ebcd29664463350c05d6256aae98be7654b14c206df5bbe017e8126dff22fad"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 05 20:24:34 crc kubenswrapper[4753]: I1005 20:24:34.491283 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" containerID="cri-o://5ebcd29664463350c05d6256aae98be7654b14c206df5bbe017e8126dff22fad" gracePeriod=600
Oct 05 20:24:35 crc kubenswrapper[4753]: I1005 20:24:35.196487 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="5ebcd29664463350c05d6256aae98be7654b14c206df5bbe017e8126dff22fad" exitCode=0
Oct 05 20:24:35 crc kubenswrapper[4753]: I1005 20:24:35.196526 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"5ebcd29664463350c05d6256aae98be7654b14c206df5bbe017e8126dff22fad"}
Oct 05 20:24:35 crc kubenswrapper[4753]: I1005 20:24:35.196588 4753 scope.go:117] "RemoveContainer" containerID="d036ff9dbc389fc79b9fd8e46b1fb2f4c9ac8b14987017b53068ab8b2e6b56cb"
Oct 05 20:24:36 crc kubenswrapper[4753]: I1005 20:24:36.202935 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"52a63543bb254f48756b18656816abfb4df8b41bf216da0e9c35cd4d17058bd4"}
Oct 05 20:24:37 crc kubenswrapper[4753]: I1005 20:24:37.211163 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" event={"ID":"b82e45fc-3c57-447f-96f3-06498b059364","Type":"ContainerStarted","Data":"9127b32a56a85563e707de5a9581d7b364b3152b82fdc52933d5c3d4d6ad7f97"}
Oct 05 20:24:37 crc kubenswrapper[4753]: I1005 20:24:37.211651 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz"
Oct 05 20:24:37 crc kubenswrapper[4753]: I1005 20:24:37.211663 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz"
Oct 05 20:24:37 crc kubenswrapper[4753]: I1005 20:24:37.211673 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz"
Oct 05 20:24:37 crc kubenswrapper[4753]: I1005 20:24:37.239999 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz" podStartSLOduration=8.239957186 podStartE2EDuration="8.239957186s" podCreationTimestamp="2025-10-05 20:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:24:37.238662347 +0000 UTC m=+586.086990579" watchObservedRunningTime="2025-10-05 20:24:37.239957186 +0000 UTC m=+586.088285418"
Oct 05 20:24:37 crc kubenswrapper[4753]: I1005 20:24:37.263124 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz"
Oct 05 20:24:37 crc kubenswrapper[4753]: I1005 20:24:37.264533 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz"
Oct 05 20:24:41 crc kubenswrapper[4753]: I1005 20:24:41.858662 4753 scope.go:117] "RemoveContainer" containerID="9909c718bf28ac2a716f30feb9dbce15eba410ec607a82b1fa8faf6328b099a0"
Oct 05 20:24:41 crc kubenswrapper[4753]: E1005 20:24:41.860633 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zr5q8_openshift-multus(8a6cead6-0872-4b49-a08c-529805f646f2)\"" pod="openshift-multus/multus-zr5q8" podUID="8a6cead6-0872-4b49-a08c-529805f646f2"
Oct 05 20:24:52 crc kubenswrapper[4753]: I1005 20:24:52.012194 4753 scope.go:117] "RemoveContainer" containerID="669c16fa7de7657d438af283f119f6b5b44b1eb3aba266d078c34f7b14aedf58"
Oct 05 20:24:52 crc kubenswrapper[4753]: I1005 20:24:52.039642 4753 scope.go:117] "RemoveContainer" containerID="ca5e173a18893323ab9fb65d1db5c6bee0355794985374b0b11eaed060bb66bf"
Oct 05 20:24:54 crc kubenswrapper[4753]: I1005 20:24:54.852341 4753 scope.go:117] "RemoveContainer" containerID="9909c718bf28ac2a716f30feb9dbce15eba410ec607a82b1fa8faf6328b099a0"
Oct 05 20:24:55 crc kubenswrapper[4753]: I1005 20:24:55.336820 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr5q8_8a6cead6-0872-4b49-a08c-529805f646f2/kube-multus/2.log"
Oct 05 20:24:55 crc kubenswrapper[4753]: I1005 20:24:55.337239 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr5q8" event={"ID":"8a6cead6-0872-4b49-a08c-529805f646f2","Type":"ContainerStarted","Data":"62c93e2cbf8992398aa07818f17220e9534720206ac57fb2b07af5563bd1accb"}
Oct 05 20:25:00 crc kubenswrapper[4753]: I1005 20:25:00.086309 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-stmvz"
Oct 05 20:25:10 crc kubenswrapper[4753]: I1005 20:25:10.941819 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"]
Oct 05 20:25:10 crc kubenswrapper[4753]: I1005 20:25:10.943286 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"
Oct 05 20:25:10 crc kubenswrapper[4753]: I1005 20:25:10.945475 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 05 20:25:10 crc kubenswrapper[4753]: I1005 20:25:10.955260 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"]
Oct 05 20:25:11 crc kubenswrapper[4753]: I1005 20:25:11.110488 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c210d16-5774-42c1-95d1-bba3c106fa44-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892\" (UID: \"2c210d16-5774-42c1-95d1-bba3c106fa44\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"
Oct 05 20:25:11 crc kubenswrapper[4753]: I1005 20:25:11.110558 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdkgb\" (UniqueName: \"kubernetes.io/projected/2c210d16-5774-42c1-95d1-bba3c106fa44-kube-api-access-vdkgb\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892\" (UID: \"2c210d16-5774-42c1-95d1-bba3c106fa44\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"
Oct 05 20:25:11 crc kubenswrapper[4753]: I1005 20:25:11.110584 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c210d16-5774-42c1-95d1-bba3c106fa44-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892\" (UID: \"2c210d16-5774-42c1-95d1-bba3c106fa44\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"
Oct 05 20:25:11 crc kubenswrapper[4753]: I1005 20:25:11.211900 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c210d16-5774-42c1-95d1-bba3c106fa44-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892\" (UID: \"2c210d16-5774-42c1-95d1-bba3c106fa44\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"
Oct 05 20:25:11 crc kubenswrapper[4753]: I1005 20:25:11.211964 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdkgb\" (UniqueName: \"kubernetes.io/projected/2c210d16-5774-42c1-95d1-bba3c106fa44-kube-api-access-vdkgb\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892\" (UID: \"2c210d16-5774-42c1-95d1-bba3c106fa44\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"
Oct 05 20:25:11 crc kubenswrapper[4753]: I1005 20:25:11.211991 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c210d16-5774-42c1-95d1-bba3c106fa44-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892\" (UID: \"2c210d16-5774-42c1-95d1-bba3c106fa44\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"
Oct 05 20:25:11 crc kubenswrapper[4753]: I1005 20:25:11.212542 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c210d16-5774-42c1-95d1-bba3c106fa44-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892\" (UID: \"2c210d16-5774-42c1-95d1-bba3c106fa44\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"
Oct 05 20:25:11 crc kubenswrapper[4753]: I1005 20:25:11.212591 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c210d16-5774-42c1-95d1-bba3c106fa44-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892\" (UID: \"2c210d16-5774-42c1-95d1-bba3c106fa44\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"
Oct 05 20:25:11 crc kubenswrapper[4753]: I1005 20:25:11.233298 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdkgb\" (UniqueName: \"kubernetes.io/projected/2c210d16-5774-42c1-95d1-bba3c106fa44-kube-api-access-vdkgb\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892\" (UID: \"2c210d16-5774-42c1-95d1-bba3c106fa44\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"
Oct 05 20:25:11 crc kubenswrapper[4753]: I1005 20:25:11.258659 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"
Oct 05 20:25:11 crc kubenswrapper[4753]: I1005 20:25:11.493162 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"]
Oct 05 20:25:12 crc kubenswrapper[4753]: I1005 20:25:12.444615 4753 generic.go:334] "Generic (PLEG): container finished" podID="2c210d16-5774-42c1-95d1-bba3c106fa44" containerID="d31dc3287ef976ffba171cf3d37fd339d82814c4c5d12a7db7648b8ed977b51b" exitCode=0
Oct 05 20:25:12 crc kubenswrapper[4753]: I1005 20:25:12.444712 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892" event={"ID":"2c210d16-5774-42c1-95d1-bba3c106fa44","Type":"ContainerDied","Data":"d31dc3287ef976ffba171cf3d37fd339d82814c4c5d12a7db7648b8ed977b51b"}
Oct 05 20:25:12 crc kubenswrapper[4753]: I1005 20:25:12.444989 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892" event={"ID":"2c210d16-5774-42c1-95d1-bba3c106fa44","Type":"ContainerStarted","Data":"f16c8f95d3ea8ae33d18c911157ff609ddf3a4c9df95d7af5d3cef7e57103b88"}
Oct 05 20:25:14 crc kubenswrapper[4753]: I1005 20:25:14.469375 4753 generic.go:334] "Generic (PLEG): container finished" podID="2c210d16-5774-42c1-95d1-bba3c106fa44" containerID="e4a85b7576f52474ef5748e7ed11ccf14cd086e454d4b6957ef0f6149ea25d35" exitCode=0
Oct 05 20:25:14 crc kubenswrapper[4753]: I1005 20:25:14.469446 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892" event={"ID":"2c210d16-5774-42c1-95d1-bba3c106fa44","Type":"ContainerDied","Data":"e4a85b7576f52474ef5748e7ed11ccf14cd086e454d4b6957ef0f6149ea25d35"}
Oct 05 20:25:15 crc kubenswrapper[4753]: I1005 20:25:15.479478 4753 generic.go:334] "Generic (PLEG): container finished" podID="2c210d16-5774-42c1-95d1-bba3c106fa44" containerID="ea3d19b5fd624c0c959be0d80ccb344e17def9a4ba29a00796222d8f30415112" exitCode=0
Oct 05 20:25:15 crc kubenswrapper[4753]: I1005 20:25:15.479548 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892" event={"ID":"2c210d16-5774-42c1-95d1-bba3c106fa44","Type":"ContainerDied","Data":"ea3d19b5fd624c0c959be0d80ccb344e17def9a4ba29a00796222d8f30415112"}
Oct 05 20:25:16 crc kubenswrapper[4753]: I1005 20:25:16.734863 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"
Oct 05 20:25:16 crc kubenswrapper[4753]: I1005 20:25:16.889252 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c210d16-5774-42c1-95d1-bba3c106fa44-bundle\") pod \"2c210d16-5774-42c1-95d1-bba3c106fa44\" (UID: \"2c210d16-5774-42c1-95d1-bba3c106fa44\") "
Oct 05 20:25:16 crc kubenswrapper[4753]: I1005 20:25:16.889316 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdkgb\" (UniqueName: \"kubernetes.io/projected/2c210d16-5774-42c1-95d1-bba3c106fa44-kube-api-access-vdkgb\") pod \"2c210d16-5774-42c1-95d1-bba3c106fa44\" (UID: \"2c210d16-5774-42c1-95d1-bba3c106fa44\") "
Oct 05 20:25:16 crc kubenswrapper[4753]: I1005 20:25:16.889344 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c210d16-5774-42c1-95d1-bba3c106fa44-util\") pod \"2c210d16-5774-42c1-95d1-bba3c106fa44\" (UID: \"2c210d16-5774-42c1-95d1-bba3c106fa44\") "
Oct 05 20:25:16 crc kubenswrapper[4753]: I1005 20:25:16.891278 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c210d16-5774-42c1-95d1-bba3c106fa44-bundle" (OuterVolumeSpecName: "bundle") pod "2c210d16-5774-42c1-95d1-bba3c106fa44" (UID: "2c210d16-5774-42c1-95d1-bba3c106fa44"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 05 20:25:16 crc kubenswrapper[4753]: I1005 20:25:16.902446 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c210d16-5774-42c1-95d1-bba3c106fa44-kube-api-access-vdkgb" (OuterVolumeSpecName: "kube-api-access-vdkgb") pod "2c210d16-5774-42c1-95d1-bba3c106fa44" (UID: "2c210d16-5774-42c1-95d1-bba3c106fa44"). InnerVolumeSpecName "kube-api-access-vdkgb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:25:16 crc kubenswrapper[4753]: I1005 20:25:16.904016 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c210d16-5774-42c1-95d1-bba3c106fa44-util" (OuterVolumeSpecName: "util") pod "2c210d16-5774-42c1-95d1-bba3c106fa44" (UID: "2c210d16-5774-42c1-95d1-bba3c106fa44"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 05 20:25:16 crc kubenswrapper[4753]: I1005 20:25:16.991084 4753 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c210d16-5774-42c1-95d1-bba3c106fa44-bundle\") on node \"crc\" DevicePath \"\""
Oct 05 20:25:16 crc kubenswrapper[4753]: I1005 20:25:16.991188 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdkgb\" (UniqueName: \"kubernetes.io/projected/2c210d16-5774-42c1-95d1-bba3c106fa44-kube-api-access-vdkgb\") on node \"crc\" DevicePath \"\""
Oct 05 20:25:16 crc kubenswrapper[4753]: I1005 20:25:16.991212 4753 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c210d16-5774-42c1-95d1-bba3c106fa44-util\") on node \"crc\" DevicePath \"\""
Oct 05 20:25:17 crc kubenswrapper[4753]: I1005 20:25:17.492751 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892" event={"ID":"2c210d16-5774-42c1-95d1-bba3c106fa44","Type":"ContainerDied","Data":"f16c8f95d3ea8ae33d18c911157ff609ddf3a4c9df95d7af5d3cef7e57103b88"}
Oct 05 20:25:17 crc kubenswrapper[4753]: I1005 20:25:17.492820 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f16c8f95d3ea8ae33d18c911157ff609ddf3a4c9df95d7af5d3cef7e57103b88"
Oct 05 20:25:17 crc kubenswrapper[4753]: I1005 20:25:17.492917 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892"
Oct 05 20:25:22 crc kubenswrapper[4753]: I1005 20:25:22.517069 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-l87q6"]
Oct 05 20:25:22 crc kubenswrapper[4753]: E1005 20:25:22.517812 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c210d16-5774-42c1-95d1-bba3c106fa44" containerName="pull"
Oct 05 20:25:22 crc kubenswrapper[4753]: I1005 20:25:22.517826 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c210d16-5774-42c1-95d1-bba3c106fa44" containerName="pull"
Oct 05 20:25:22 crc kubenswrapper[4753]: E1005 20:25:22.517839 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c210d16-5774-42c1-95d1-bba3c106fa44" containerName="util"
Oct 05 20:25:22 crc kubenswrapper[4753]: I1005 20:25:22.517846 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c210d16-5774-42c1-95d1-bba3c106fa44" containerName="util"
Oct 05 20:25:22 crc kubenswrapper[4753]: E1005 20:25:22.517868 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c210d16-5774-42c1-95d1-bba3c106fa44" containerName="extract"
Oct 05 20:25:22 crc kubenswrapper[4753]: I1005 20:25:22.517874 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c210d16-5774-42c1-95d1-bba3c106fa44" containerName="extract"
Oct 05 20:25:22 crc kubenswrapper[4753]: I1005 20:25:22.517992 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c210d16-5774-42c1-95d1-bba3c106fa44" containerName="extract"
Oct 05 20:25:22 crc kubenswrapper[4753]: I1005 20:25:22.518420 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-l87q6"
Oct 05 20:25:22 crc kubenswrapper[4753]: I1005 20:25:22.520089 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-fqczd"
Oct 05 20:25:22 crc kubenswrapper[4753]: I1005 20:25:22.520766 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Oct 05 20:25:22 crc kubenswrapper[4753]: I1005 20:25:22.521859 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Oct 05 20:25:22 crc kubenswrapper[4753]: I1005 20:25:22.536253 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-l87q6"]
Oct 05 20:25:22 crc kubenswrapper[4753]: I1005 20:25:22.665713 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlx8f\" (UniqueName: \"kubernetes.io/projected/ae5b861a-0212-4c4c-944b-7d6f3187a5a8-kube-api-access-wlx8f\") pod \"nmstate-operator-858ddd8f98-l87q6\" (UID: \"ae5b861a-0212-4c4c-944b-7d6f3187a5a8\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-l87q6"
Oct 05 20:25:22 crc kubenswrapper[4753]: I1005 20:25:22.767270 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlx8f\" (UniqueName: \"kubernetes.io/projected/ae5b861a-0212-4c4c-944b-7d6f3187a5a8-kube-api-access-wlx8f\") pod \"nmstate-operator-858ddd8f98-l87q6\" (UID: \"ae5b861a-0212-4c4c-944b-7d6f3187a5a8\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-l87q6"
Oct 05 20:25:22 crc kubenswrapper[4753]: I1005 20:25:22.793382 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlx8f\" (UniqueName: \"kubernetes.io/projected/ae5b861a-0212-4c4c-944b-7d6f3187a5a8-kube-api-access-wlx8f\") pod \"nmstate-operator-858ddd8f98-l87q6\" (UID: \"ae5b861a-0212-4c4c-944b-7d6f3187a5a8\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-l87q6"
Oct 05 20:25:22 crc kubenswrapper[4753]: I1005 20:25:22.834895 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-l87q6"
Oct 05 20:25:23 crc kubenswrapper[4753]: I1005 20:25:23.083174 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-l87q6"]
Oct 05 20:25:23 crc kubenswrapper[4753]: I1005 20:25:23.522001 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-l87q6" event={"ID":"ae5b861a-0212-4c4c-944b-7d6f3187a5a8","Type":"ContainerStarted","Data":"dc8ea6b380d5d92b6268c643f27aa649fc080324f58a2fbaf5768bc5e9c3b36e"}
Oct 05 20:25:26 crc kubenswrapper[4753]: I1005 20:25:26.535954 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-l87q6" event={"ID":"ae5b861a-0212-4c4c-944b-7d6f3187a5a8","Type":"ContainerStarted","Data":"afe64b3f1642e2758acc4a2d07c034cd0efb7d8780b3e5f96a837c756a939129"}
Oct 05 20:25:26 crc kubenswrapper[4753]: I1005 20:25:26.554276 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-l87q6" podStartSLOduration=1.99801907 podStartE2EDuration="4.554258887s" podCreationTimestamp="2025-10-05 20:25:22 +0000 UTC" firstStartedPulling="2025-10-05 20:25:23.100706207 +0000 UTC m=+631.949034439" lastFinishedPulling="2025-10-05 20:25:25.656946024 +0000 UTC m=+634.505274256" observedRunningTime="2025-10-05 20:25:26.550449964 +0000 UTC m=+635.398778196" watchObservedRunningTime="2025-10-05 20:25:26.554258887 +0000 UTC m=+635.402587119"
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.847548 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-q2rfx"]
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.849017 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-q2rfx"
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.856921 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-q2rfx"]
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.856955 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx"]
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.857554 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx"
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.863601 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-5mhrc"]
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.864229 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-5mhrc"
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.865961 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.866035 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-t4gq4"
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.916527 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxqmk\" (UniqueName: \"kubernetes.io/projected/1b16b6bd-7d95-49c5-a2a8-87b0018e30c7-kube-api-access-kxqmk\") pod \"nmstate-webhook-6cdbc54649-k58rx\" (UID: \"1b16b6bd-7d95-49c5-a2a8-87b0018e30c7\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx"
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.916582 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f5c8b381-2e86-47f5-86da-86db3c2aa511-nmstate-lock\") pod \"nmstate-handler-5mhrc\" (UID: \"f5c8b381-2e86-47f5-86da-86db3c2aa511\") " pod="openshift-nmstate/nmstate-handler-5mhrc"
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.916603 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phdnr\" (UniqueName: \"kubernetes.io/projected/f5c8b381-2e86-47f5-86da-86db3c2aa511-kube-api-access-phdnr\") pod \"nmstate-handler-5mhrc\" (UID: \"f5c8b381-2e86-47f5-86da-86db3c2aa511\") " pod="openshift-nmstate/nmstate-handler-5mhrc"
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.916631 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6ngf\" (UniqueName: \"kubernetes.io/projected/d825ed58-9313-4cf2-a923-53e6d809fa60-kube-api-access-p6ngf\") pod \"nmstate-metrics-fdff9cb8d-q2rfx\" (UID: \"d825ed58-9313-4cf2-a923-53e6d809fa60\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-q2rfx"
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.916679 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1b16b6bd-7d95-49c5-a2a8-87b0018e30c7-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-k58rx\" (UID: \"1b16b6bd-7d95-49c5-a2a8-87b0018e30c7\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx"
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.916698 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f5c8b381-2e86-47f5-86da-86db3c2aa511-dbus-socket\") pod \"nmstate-handler-5mhrc\" (UID: \"f5c8b381-2e86-47f5-86da-86db3c2aa511\") " pod="openshift-nmstate/nmstate-handler-5mhrc"
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.916724 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f5c8b381-2e86-47f5-86da-86db3c2aa511-ovs-socket\") pod \"nmstate-handler-5mhrc\" (UID: \"f5c8b381-2e86-47f5-86da-86db3c2aa511\") " pod="openshift-nmstate/nmstate-handler-5mhrc"
Oct 05 20:25:31 crc kubenswrapper[4753]: I1005 20:25:31.919195 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx"]
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.017518 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f5c8b381-2e86-47f5-86da-86db3c2aa511-ovs-socket\") pod \"nmstate-handler-5mhrc\" (UID: \"f5c8b381-2e86-47f5-86da-86db3c2aa511\") " pod="openshift-nmstate/nmstate-handler-5mhrc"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.017582 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxqmk\" (UniqueName: \"kubernetes.io/projected/1b16b6bd-7d95-49c5-a2a8-87b0018e30c7-kube-api-access-kxqmk\") pod \"nmstate-webhook-6cdbc54649-k58rx\" (UID: \"1b16b6bd-7d95-49c5-a2a8-87b0018e30c7\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.017608 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f5c8b381-2e86-47f5-86da-86db3c2aa511-nmstate-lock\") pod \"nmstate-handler-5mhrc\" (UID: \"f5c8b381-2e86-47f5-86da-86db3c2aa511\") " pod="openshift-nmstate/nmstate-handler-5mhrc"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.017626 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phdnr\" (UniqueName: \"kubernetes.io/projected/f5c8b381-2e86-47f5-86da-86db3c2aa511-kube-api-access-phdnr\") pod \"nmstate-handler-5mhrc\" (UID: \"f5c8b381-2e86-47f5-86da-86db3c2aa511\") " pod="openshift-nmstate/nmstate-handler-5mhrc"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.017651 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ngf\" (UniqueName: \"kubernetes.io/projected/d825ed58-9313-4cf2-a923-53e6d809fa60-kube-api-access-p6ngf\") pod \"nmstate-metrics-fdff9cb8d-q2rfx\" (UID: \"d825ed58-9313-4cf2-a923-53e6d809fa60\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-q2rfx"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.017652 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f5c8b381-2e86-47f5-86da-86db3c2aa511-nmstate-lock\") pod \"nmstate-handler-5mhrc\" (UID: \"f5c8b381-2e86-47f5-86da-86db3c2aa511\") " pod="openshift-nmstate/nmstate-handler-5mhrc"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.017624 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f5c8b381-2e86-47f5-86da-86db3c2aa511-ovs-socket\") pod \"nmstate-handler-5mhrc\" (UID: \"f5c8b381-2e86-47f5-86da-86db3c2aa511\") " pod="openshift-nmstate/nmstate-handler-5mhrc"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.017716 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1b16b6bd-7d95-49c5-a2a8-87b0018e30c7-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-k58rx\" (UID: \"1b16b6bd-7d95-49c5-a2a8-87b0018e30c7\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.017931 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f5c8b381-2e86-47f5-86da-86db3c2aa511-dbus-socket\") pod \"nmstate-handler-5mhrc\" (UID: \"f5c8b381-2e86-47f5-86da-86db3c2aa511\") " pod="openshift-nmstate/nmstate-handler-5mhrc"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.018228 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f5c8b381-2e86-47f5-86da-86db3c2aa511-dbus-socket\") pod \"nmstate-handler-5mhrc\" (UID: \"f5c8b381-2e86-47f5-86da-86db3c2aa511\") " pod="openshift-nmstate/nmstate-handler-5mhrc"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.035870 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1b16b6bd-7d95-49c5-a2a8-87b0018e30c7-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-k58rx\" (UID: \"1b16b6bd-7d95-49c5-a2a8-87b0018e30c7\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.035890 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phdnr\" (UniqueName: \"kubernetes.io/projected/f5c8b381-2e86-47f5-86da-86db3c2aa511-kube-api-access-phdnr\") pod \"nmstate-handler-5mhrc\" (UID: \"f5c8b381-2e86-47f5-86da-86db3c2aa511\") " pod="openshift-nmstate/nmstate-handler-5mhrc"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.043667 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxqmk\" (UniqueName: \"kubernetes.io/projected/1b16b6bd-7d95-49c5-a2a8-87b0018e30c7-kube-api-access-kxqmk\") pod \"nmstate-webhook-6cdbc54649-k58rx\" (UID: \"1b16b6bd-7d95-49c5-a2a8-87b0018e30c7\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.044418 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6ngf\" (UniqueName: \"kubernetes.io/projected/d825ed58-9313-4cf2-a923-53e6d809fa60-kube-api-access-p6ngf\") pod \"nmstate-metrics-fdff9cb8d-q2rfx\" (UID: \"d825ed58-9313-4cf2-a923-53e6d809fa60\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-q2rfx"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.062285 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn"]
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.067836 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.076102 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.076233 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-sjvmh"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.082702 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.092477 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn"]
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.170241 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-q2rfx"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.180911 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.189895 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-5mhrc"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.221292 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v5j4\" (UniqueName: \"kubernetes.io/projected/51e12003-95ee-4af9-a340-5928bb9d7ae7-kube-api-access-2v5j4\") pod \"nmstate-console-plugin-6b874cbd85-md9qn\" (UID: \"51e12003-95ee-4af9-a340-5928bb9d7ae7\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.221421 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/51e12003-95ee-4af9-a340-5928bb9d7ae7-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-md9qn\" (UID: \"51e12003-95ee-4af9-a340-5928bb9d7ae7\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.221480 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/51e12003-95ee-4af9-a340-5928bb9d7ae7-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-md9qn\" (UID: \"51e12003-95ee-4af9-a340-5928bb9d7ae7\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.276083 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-69b9d7b4db-hcvn8"]
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.276842 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69b9d7b4db-hcvn8"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.298238 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69b9d7b4db-hcvn8"]
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.324958 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-trusted-ca-bundle\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.325031 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-console-config\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.325063 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v5j4\" (UniqueName: \"kubernetes.io/projected/51e12003-95ee-4af9-a340-5928bb9d7ae7-kube-api-access-2v5j4\") pod \"nmstate-console-plugin-6b874cbd85-md9qn\" (UID: \"51e12003-95ee-4af9-a340-5928bb9d7ae7\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.325086 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-console-serving-cert\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.325107 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-service-ca\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.325160 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/51e12003-95ee-4af9-a340-5928bb9d7ae7-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-md9qn\" (UID: \"51e12003-95ee-4af9-a340-5928bb9d7ae7\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.325188 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-oauth-serving-cert\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.325209 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-console-oauth-config\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8"
Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.325253 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/51e12003-95ee-4af9-a340-5928bb9d7ae7-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-md9qn\" (UID: \"51e12003-95ee-4af9-a340-5928bb9d7ae7\") "
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.325273 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzlqt\" (UniqueName: \"kubernetes.io/projected/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-kube-api-access-gzlqt\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.326402 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/51e12003-95ee-4af9-a340-5928bb9d7ae7-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-md9qn\" (UID: \"51e12003-95ee-4af9-a340-5928bb9d7ae7\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.331668 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/51e12003-95ee-4af9-a340-5928bb9d7ae7-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-md9qn\" (UID: \"51e12003-95ee-4af9-a340-5928bb9d7ae7\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.347760 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v5j4\" (UniqueName: \"kubernetes.io/projected/51e12003-95ee-4af9-a340-5928bb9d7ae7-kube-api-access-2v5j4\") pod \"nmstate-console-plugin-6b874cbd85-md9qn\" (UID: \"51e12003-95ee-4af9-a340-5928bb9d7ae7\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.406986 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.426717 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-oauth-serving-cert\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.426770 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-console-oauth-config\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.426800 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzlqt\" (UniqueName: \"kubernetes.io/projected/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-kube-api-access-gzlqt\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.426832 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-trusted-ca-bundle\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.426863 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-console-config\") pod \"console-69b9d7b4db-hcvn8\" (UID: 
\"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.426892 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-console-serving-cert\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.426915 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-service-ca\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.428513 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-service-ca\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.428553 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-console-config\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.428828 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-trusted-ca-bundle\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " 
pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.428927 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-oauth-serving-cert\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.431546 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-console-serving-cert\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.436267 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-console-oauth-config\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.451070 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzlqt\" (UniqueName: \"kubernetes.io/projected/0ac71e15-9508-4ba4-977b-e9bf8366e4ec-kube-api-access-gzlqt\") pod \"console-69b9d7b4db-hcvn8\" (UID: \"0ac71e15-9508-4ba4-977b-e9bf8366e4ec\") " pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.495519 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-q2rfx"] Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.551458 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx"] Oct 05 20:25:32 crc 
kubenswrapper[4753]: W1005 20:25:32.560690 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b16b6bd_7d95_49c5_a2a8_87b0018e30c7.slice/crio-1faf589ea5a1b52609ad66a507bbcbda7623f29ea4ac1a04c9b84f19406eb178 WatchSource:0}: Error finding container 1faf589ea5a1b52609ad66a507bbcbda7623f29ea4ac1a04c9b84f19406eb178: Status 404 returned error can't find the container with id 1faf589ea5a1b52609ad66a507bbcbda7623f29ea4ac1a04c9b84f19406eb178 Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.564944 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5mhrc" event={"ID":"f5c8b381-2e86-47f5-86da-86db3c2aa511","Type":"ContainerStarted","Data":"ec04623acf20cf1f1d54a0261f3d16ce3fc87dff2e61ea8ff9a6c003db8ff8cd"} Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.567701 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-q2rfx" event={"ID":"d825ed58-9313-4cf2-a923-53e6d809fa60","Type":"ContainerStarted","Data":"03385908d511cf67228d3ba49dd8b30a46c41b9bfef3867ef646b6dcf4991411"} Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.601440 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.663680 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn"] Oct 05 20:25:32 crc kubenswrapper[4753]: W1005 20:25:32.669040 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51e12003_95ee_4af9_a340_5928bb9d7ae7.slice/crio-37ded8ba79b554ee5ff968742ce7ccb254576650d32a5c10c8b061911295ed33 WatchSource:0}: Error finding container 37ded8ba79b554ee5ff968742ce7ccb254576650d32a5c10c8b061911295ed33: Status 404 returned error can't find the container with id 37ded8ba79b554ee5ff968742ce7ccb254576650d32a5c10c8b061911295ed33 Oct 05 20:25:32 crc kubenswrapper[4753]: I1005 20:25:32.804804 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69b9d7b4db-hcvn8"] Oct 05 20:25:32 crc kubenswrapper[4753]: W1005 20:25:32.810197 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ac71e15_9508_4ba4_977b_e9bf8366e4ec.slice/crio-5c38f522d544ed3b1727967fd02cee5998803da198e75fb587e038169a78ae52 WatchSource:0}: Error finding container 5c38f522d544ed3b1727967fd02cee5998803da198e75fb587e038169a78ae52: Status 404 returned error can't find the container with id 5c38f522d544ed3b1727967fd02cee5998803da198e75fb587e038169a78ae52 Oct 05 20:25:33 crc kubenswrapper[4753]: I1005 20:25:33.573365 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx" event={"ID":"1b16b6bd-7d95-49c5-a2a8-87b0018e30c7","Type":"ContainerStarted","Data":"1faf589ea5a1b52609ad66a507bbcbda7623f29ea4ac1a04c9b84f19406eb178"} Oct 05 20:25:33 crc kubenswrapper[4753]: I1005 20:25:33.575524 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b9d7b4db-hcvn8" 
event={"ID":"0ac71e15-9508-4ba4-977b-e9bf8366e4ec","Type":"ContainerStarted","Data":"92a7659a22f90d801e1abbac108ddb29b9328a704e6d326303eb2a7bdc24c1b9"} Oct 05 20:25:33 crc kubenswrapper[4753]: I1005 20:25:33.575902 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b9d7b4db-hcvn8" event={"ID":"0ac71e15-9508-4ba4-977b-e9bf8366e4ec","Type":"ContainerStarted","Data":"5c38f522d544ed3b1727967fd02cee5998803da198e75fb587e038169a78ae52"} Oct 05 20:25:33 crc kubenswrapper[4753]: I1005 20:25:33.576656 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn" event={"ID":"51e12003-95ee-4af9-a340-5928bb9d7ae7","Type":"ContainerStarted","Data":"37ded8ba79b554ee5ff968742ce7ccb254576650d32a5c10c8b061911295ed33"} Oct 05 20:25:33 crc kubenswrapper[4753]: I1005 20:25:33.594712 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69b9d7b4db-hcvn8" podStartSLOduration=1.5946979730000002 podStartE2EDuration="1.594697973s" podCreationTimestamp="2025-10-05 20:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:25:33.591074705 +0000 UTC m=+642.439402947" watchObservedRunningTime="2025-10-05 20:25:33.594697973 +0000 UTC m=+642.443026195" Oct 05 20:25:36 crc kubenswrapper[4753]: I1005 20:25:36.592198 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-q2rfx" event={"ID":"d825ed58-9313-4cf2-a923-53e6d809fa60","Type":"ContainerStarted","Data":"6151698f097bdcb4a0df098271897cc8c95bf007f5638480c81a8b5677a80a9d"} Oct 05 20:25:36 crc kubenswrapper[4753]: I1005 20:25:36.593957 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn" 
event={"ID":"51e12003-95ee-4af9-a340-5928bb9d7ae7","Type":"ContainerStarted","Data":"fe45c2b0d6b1d2e4d0d5e8ba136b1666b48495afb764468fb3ac77afa0eaa5f9"} Oct 05 20:25:36 crc kubenswrapper[4753]: I1005 20:25:36.595504 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx" event={"ID":"1b16b6bd-7d95-49c5-a2a8-87b0018e30c7","Type":"ContainerStarted","Data":"89cb739c8ad5bb6e359a531d92e54402989867487c691aed761102f368bea6fc"} Oct 05 20:25:36 crc kubenswrapper[4753]: I1005 20:25:36.595968 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx" Oct 05 20:25:36 crc kubenswrapper[4753]: I1005 20:25:36.597558 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5mhrc" event={"ID":"f5c8b381-2e86-47f5-86da-86db3c2aa511","Type":"ContainerStarted","Data":"92c524418291828a2b508c7bc4122cbd92be3ec22edbbf4ecb14f945fcaab2f7"} Oct 05 20:25:36 crc kubenswrapper[4753]: I1005 20:25:36.597897 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-5mhrc" Oct 05 20:25:36 crc kubenswrapper[4753]: I1005 20:25:36.605840 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-md9qn" podStartSLOduration=1.6068992880000001 podStartE2EDuration="4.605826332s" podCreationTimestamp="2025-10-05 20:25:32 +0000 UTC" firstStartedPulling="2025-10-05 20:25:32.67184236 +0000 UTC m=+641.520170592" lastFinishedPulling="2025-10-05 20:25:35.670769404 +0000 UTC m=+644.519097636" observedRunningTime="2025-10-05 20:25:36.604516082 +0000 UTC m=+645.452844314" watchObservedRunningTime="2025-10-05 20:25:36.605826332 +0000 UTC m=+645.454154564" Oct 05 20:25:36 crc kubenswrapper[4753]: I1005 20:25:36.628037 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-5mhrc" 
podStartSLOduration=2.204769615 podStartE2EDuration="5.62801968s" podCreationTimestamp="2025-10-05 20:25:31 +0000 UTC" firstStartedPulling="2025-10-05 20:25:32.252224375 +0000 UTC m=+641.100552607" lastFinishedPulling="2025-10-05 20:25:35.67547444 +0000 UTC m=+644.523802672" observedRunningTime="2025-10-05 20:25:36.624187141 +0000 UTC m=+645.472515373" watchObservedRunningTime="2025-10-05 20:25:36.62801968 +0000 UTC m=+645.476347912" Oct 05 20:25:36 crc kubenswrapper[4753]: I1005 20:25:36.648719 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx" podStartSLOduration=2.53688864 podStartE2EDuration="5.648703511s" podCreationTimestamp="2025-10-05 20:25:31 +0000 UTC" firstStartedPulling="2025-10-05 20:25:32.564853935 +0000 UTC m=+641.413182167" lastFinishedPulling="2025-10-05 20:25:35.676668806 +0000 UTC m=+644.524997038" observedRunningTime="2025-10-05 20:25:36.648678451 +0000 UTC m=+645.497006693" watchObservedRunningTime="2025-10-05 20:25:36.648703511 +0000 UTC m=+645.497031743" Oct 05 20:25:38 crc kubenswrapper[4753]: I1005 20:25:38.623731 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-q2rfx" event={"ID":"d825ed58-9313-4cf2-a923-53e6d809fa60","Type":"ContainerStarted","Data":"a17db81e778a4f3eec269b700bb8e40111cc68f2b4a5ad5a6e62fcabb1cb7975"} Oct 05 20:25:38 crc kubenswrapper[4753]: I1005 20:25:38.643674 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-q2rfx" podStartSLOduration=2.044946042 podStartE2EDuration="7.643652316s" podCreationTimestamp="2025-10-05 20:25:31 +0000 UTC" firstStartedPulling="2025-10-05 20:25:32.51131427 +0000 UTC m=+641.359642502" lastFinishedPulling="2025-10-05 20:25:38.110020544 +0000 UTC m=+646.958348776" observedRunningTime="2025-10-05 20:25:38.637605799 +0000 UTC m=+647.485934051" watchObservedRunningTime="2025-10-05 
20:25:38.643652316 +0000 UTC m=+647.491980558" Oct 05 20:25:42 crc kubenswrapper[4753]: I1005 20:25:42.221560 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-5mhrc" Oct 05 20:25:42 crc kubenswrapper[4753]: I1005 20:25:42.602523 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:42 crc kubenswrapper[4753]: I1005 20:25:42.602646 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:42 crc kubenswrapper[4753]: I1005 20:25:42.607194 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:42 crc kubenswrapper[4753]: I1005 20:25:42.652223 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69b9d7b4db-hcvn8" Oct 05 20:25:42 crc kubenswrapper[4753]: I1005 20:25:42.702877 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7klvp"] Oct 05 20:25:52 crc kubenswrapper[4753]: I1005 20:25:52.188352 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k58rx" Oct 05 20:26:06 crc kubenswrapper[4753]: I1005 20:26:06.320102 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88"] Oct 05 20:26:06 crc kubenswrapper[4753]: I1005 20:26:06.321639 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" Oct 05 20:26:06 crc kubenswrapper[4753]: I1005 20:26:06.323534 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 05 20:26:06 crc kubenswrapper[4753]: I1005 20:26:06.333049 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88"] Oct 05 20:26:06 crc kubenswrapper[4753]: I1005 20:26:06.391009 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv9sv\" (UniqueName: \"kubernetes.io/projected/e668534c-ad83-4c7d-9270-bffa57782d91-kube-api-access-mv9sv\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88\" (UID: \"e668534c-ad83-4c7d-9270-bffa57782d91\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" Oct 05 20:26:06 crc kubenswrapper[4753]: I1005 20:26:06.391068 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e668534c-ad83-4c7d-9270-bffa57782d91-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88\" (UID: \"e668534c-ad83-4c7d-9270-bffa57782d91\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" Oct 05 20:26:06 crc kubenswrapper[4753]: I1005 20:26:06.391103 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e668534c-ad83-4c7d-9270-bffa57782d91-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88\" (UID: \"e668534c-ad83-4c7d-9270-bffa57782d91\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" Oct 05 20:26:06 crc kubenswrapper[4753]: 
I1005 20:26:06.492197 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e668534c-ad83-4c7d-9270-bffa57782d91-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88\" (UID: \"e668534c-ad83-4c7d-9270-bffa57782d91\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" Oct 05 20:26:06 crc kubenswrapper[4753]: I1005 20:26:06.492271 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e668534c-ad83-4c7d-9270-bffa57782d91-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88\" (UID: \"e668534c-ad83-4c7d-9270-bffa57782d91\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" Oct 05 20:26:06 crc kubenswrapper[4753]: I1005 20:26:06.492573 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e668534c-ad83-4c7d-9270-bffa57782d91-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88\" (UID: \"e668534c-ad83-4c7d-9270-bffa57782d91\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" Oct 05 20:26:06 crc kubenswrapper[4753]: I1005 20:26:06.492596 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e668534c-ad83-4c7d-9270-bffa57782d91-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88\" (UID: \"e668534c-ad83-4c7d-9270-bffa57782d91\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" Oct 05 20:26:06 crc kubenswrapper[4753]: I1005 20:26:06.492664 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv9sv\" (UniqueName: 
\"kubernetes.io/projected/e668534c-ad83-4c7d-9270-bffa57782d91-kube-api-access-mv9sv\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88\" (UID: \"e668534c-ad83-4c7d-9270-bffa57782d91\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" Oct 05 20:26:06 crc kubenswrapper[4753]: I1005 20:26:06.516446 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv9sv\" (UniqueName: \"kubernetes.io/projected/e668534c-ad83-4c7d-9270-bffa57782d91-kube-api-access-mv9sv\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88\" (UID: \"e668534c-ad83-4c7d-9270-bffa57782d91\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" Oct 05 20:26:06 crc kubenswrapper[4753]: I1005 20:26:06.640463 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" Oct 05 20:26:07 crc kubenswrapper[4753]: I1005 20:26:07.050084 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88"] Oct 05 20:26:07 crc kubenswrapper[4753]: W1005 20:26:07.062323 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode668534c_ad83_4c7d_9270_bffa57782d91.slice/crio-213083fa2bd2c55579b0d7e5ee93f638679a860cbfc301ddecf14c2c36dc2c28 WatchSource:0}: Error finding container 213083fa2bd2c55579b0d7e5ee93f638679a860cbfc301ddecf14c2c36dc2c28: Status 404 returned error can't find the container with id 213083fa2bd2c55579b0d7e5ee93f638679a860cbfc301ddecf14c2c36dc2c28 Oct 05 20:26:07 crc kubenswrapper[4753]: I1005 20:26:07.757930 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-7klvp" podUID="eb329af5-99e8-42d9-b79e-4c9acd09204d" 
containerName="console" containerID="cri-o://86a62bfa2597b4fc213a91e7905006bd9f7f4d3472c15cdd88f2a08ef9d9fc7e" gracePeriod=15 Oct 05 20:26:07 crc kubenswrapper[4753]: I1005 20:26:07.823852 4753 generic.go:334] "Generic (PLEG): container finished" podID="e668534c-ad83-4c7d-9270-bffa57782d91" containerID="29a386d8bdb3341a2ebd3d9ab65f8b95ac5d52735ed3b71668013a62005b3326" exitCode=0 Oct 05 20:26:07 crc kubenswrapper[4753]: I1005 20:26:07.823931 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" event={"ID":"e668534c-ad83-4c7d-9270-bffa57782d91","Type":"ContainerDied","Data":"29a386d8bdb3341a2ebd3d9ab65f8b95ac5d52735ed3b71668013a62005b3326"} Oct 05 20:26:07 crc kubenswrapper[4753]: I1005 20:26:07.824004 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" event={"ID":"e668534c-ad83-4c7d-9270-bffa57782d91","Type":"ContainerStarted","Data":"213083fa2bd2c55579b0d7e5ee93f638679a860cbfc301ddecf14c2c36dc2c28"} Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.102519 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7klvp_eb329af5-99e8-42d9-b79e-4c9acd09204d/console/0.log" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.102590 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.112407 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvgnv\" (UniqueName: \"kubernetes.io/projected/eb329af5-99e8-42d9-b79e-4c9acd09204d-kube-api-access-rvgnv\") pod \"eb329af5-99e8-42d9-b79e-4c9acd09204d\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.112465 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-oauth-config\") pod \"eb329af5-99e8-42d9-b79e-4c9acd09204d\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.112492 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-oauth-serving-cert\") pod \"eb329af5-99e8-42d9-b79e-4c9acd09204d\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.112511 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-service-ca\") pod \"eb329af5-99e8-42d9-b79e-4c9acd09204d\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.112544 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-trusted-ca-bundle\") pod \"eb329af5-99e8-42d9-b79e-4c9acd09204d\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.112563 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-config\") pod \"eb329af5-99e8-42d9-b79e-4c9acd09204d\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.112579 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-serving-cert\") pod \"eb329af5-99e8-42d9-b79e-4c9acd09204d\" (UID: \"eb329af5-99e8-42d9-b79e-4c9acd09204d\") " Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.113730 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "eb329af5-99e8-42d9-b79e-4c9acd09204d" (UID: "eb329af5-99e8-42d9-b79e-4c9acd09204d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.116292 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-config" (OuterVolumeSpecName: "console-config") pod "eb329af5-99e8-42d9-b79e-4c9acd09204d" (UID: "eb329af5-99e8-42d9-b79e-4c9acd09204d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.116472 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-service-ca" (OuterVolumeSpecName: "service-ca") pod "eb329af5-99e8-42d9-b79e-4c9acd09204d" (UID: "eb329af5-99e8-42d9-b79e-4c9acd09204d"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.116063 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "eb329af5-99e8-42d9-b79e-4c9acd09204d" (UID: "eb329af5-99e8-42d9-b79e-4c9acd09204d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.124369 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb329af5-99e8-42d9-b79e-4c9acd09204d-kube-api-access-rvgnv" (OuterVolumeSpecName: "kube-api-access-rvgnv") pod "eb329af5-99e8-42d9-b79e-4c9acd09204d" (UID: "eb329af5-99e8-42d9-b79e-4c9acd09204d"). InnerVolumeSpecName "kube-api-access-rvgnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.126027 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "eb329af5-99e8-42d9-b79e-4c9acd09204d" (UID: "eb329af5-99e8-42d9-b79e-4c9acd09204d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.126361 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "eb329af5-99e8-42d9-b79e-4c9acd09204d" (UID: "eb329af5-99e8-42d9-b79e-4c9acd09204d"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.214057 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvgnv\" (UniqueName: \"kubernetes.io/projected/eb329af5-99e8-42d9-b79e-4c9acd09204d-kube-api-access-rvgnv\") on node \"crc\" DevicePath \"\"" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.214103 4753 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.214117 4753 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.214130 4753 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-service-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.214164 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.214176 4753 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.214188 4753 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb329af5-99e8-42d9-b79e-4c9acd09204d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:26:08 crc 
kubenswrapper[4753]: I1005 20:26:08.831128 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7klvp_eb329af5-99e8-42d9-b79e-4c9acd09204d/console/0.log" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.831246 4753 generic.go:334] "Generic (PLEG): container finished" podID="eb329af5-99e8-42d9-b79e-4c9acd09204d" containerID="86a62bfa2597b4fc213a91e7905006bd9f7f4d3472c15cdd88f2a08ef9d9fc7e" exitCode=2 Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.831278 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7klvp" event={"ID":"eb329af5-99e8-42d9-b79e-4c9acd09204d","Type":"ContainerDied","Data":"86a62bfa2597b4fc213a91e7905006bd9f7f4d3472c15cdd88f2a08ef9d9fc7e"} Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.831316 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7klvp" event={"ID":"eb329af5-99e8-42d9-b79e-4c9acd09204d","Type":"ContainerDied","Data":"82648570a585e6023562d0debdcb0b25fe43c56a25c25e260f29d09543f0a6c0"} Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.831331 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-7klvp" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.831339 4753 scope.go:117] "RemoveContainer" containerID="86a62bfa2597b4fc213a91e7905006bd9f7f4d3472c15cdd88f2a08ef9d9fc7e" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.848033 4753 scope.go:117] "RemoveContainer" containerID="86a62bfa2597b4fc213a91e7905006bd9f7f4d3472c15cdd88f2a08ef9d9fc7e" Oct 05 20:26:08 crc kubenswrapper[4753]: E1005 20:26:08.848447 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86a62bfa2597b4fc213a91e7905006bd9f7f4d3472c15cdd88f2a08ef9d9fc7e\": container with ID starting with 86a62bfa2597b4fc213a91e7905006bd9f7f4d3472c15cdd88f2a08ef9d9fc7e not found: ID does not exist" containerID="86a62bfa2597b4fc213a91e7905006bd9f7f4d3472c15cdd88f2a08ef9d9fc7e" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.848479 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86a62bfa2597b4fc213a91e7905006bd9f7f4d3472c15cdd88f2a08ef9d9fc7e"} err="failed to get container status \"86a62bfa2597b4fc213a91e7905006bd9f7f4d3472c15cdd88f2a08ef9d9fc7e\": rpc error: code = NotFound desc = could not find container \"86a62bfa2597b4fc213a91e7905006bd9f7f4d3472c15cdd88f2a08ef9d9fc7e\": container with ID starting with 86a62bfa2597b4fc213a91e7905006bd9f7f4d3472c15cdd88f2a08ef9d9fc7e not found: ID does not exist" Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.867405 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7klvp"] Oct 05 20:26:08 crc kubenswrapper[4753]: I1005 20:26:08.871333 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-7klvp"] Oct 05 20:26:09 crc kubenswrapper[4753]: I1005 20:26:09.838066 4753 generic.go:334] "Generic (PLEG): container finished" podID="e668534c-ad83-4c7d-9270-bffa57782d91" 
containerID="836be08aeebee4792efa10414aa488ce7039bf50a9b6671265af3be08ccb7c5e" exitCode=0 Oct 05 20:26:09 crc kubenswrapper[4753]: I1005 20:26:09.838180 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" event={"ID":"e668534c-ad83-4c7d-9270-bffa57782d91","Type":"ContainerDied","Data":"836be08aeebee4792efa10414aa488ce7039bf50a9b6671265af3be08ccb7c5e"} Oct 05 20:26:09 crc kubenswrapper[4753]: I1005 20:26:09.862961 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb329af5-99e8-42d9-b79e-4c9acd09204d" path="/var/lib/kubelet/pods/eb329af5-99e8-42d9-b79e-4c9acd09204d/volumes" Oct 05 20:26:10 crc kubenswrapper[4753]: I1005 20:26:10.850712 4753 generic.go:334] "Generic (PLEG): container finished" podID="e668534c-ad83-4c7d-9270-bffa57782d91" containerID="93e583d4409b10d66a6127d14c146b04c76eaf4b314b3fb20a695fcfff9d8eb3" exitCode=0 Oct 05 20:26:10 crc kubenswrapper[4753]: I1005 20:26:10.850811 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" event={"ID":"e668534c-ad83-4c7d-9270-bffa57782d91","Type":"ContainerDied","Data":"93e583d4409b10d66a6127d14c146b04c76eaf4b314b3fb20a695fcfff9d8eb3"} Oct 05 20:26:12 crc kubenswrapper[4753]: I1005 20:26:12.166482 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" Oct 05 20:26:12 crc kubenswrapper[4753]: I1005 20:26:12.261725 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e668534c-ad83-4c7d-9270-bffa57782d91-bundle\") pod \"e668534c-ad83-4c7d-9270-bffa57782d91\" (UID: \"e668534c-ad83-4c7d-9270-bffa57782d91\") " Oct 05 20:26:12 crc kubenswrapper[4753]: I1005 20:26:12.261899 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e668534c-ad83-4c7d-9270-bffa57782d91-util\") pod \"e668534c-ad83-4c7d-9270-bffa57782d91\" (UID: \"e668534c-ad83-4c7d-9270-bffa57782d91\") " Oct 05 20:26:12 crc kubenswrapper[4753]: I1005 20:26:12.263013 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e668534c-ad83-4c7d-9270-bffa57782d91-bundle" (OuterVolumeSpecName: "bundle") pod "e668534c-ad83-4c7d-9270-bffa57782d91" (UID: "e668534c-ad83-4c7d-9270-bffa57782d91"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:26:12 crc kubenswrapper[4753]: I1005 20:26:12.263490 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv9sv\" (UniqueName: \"kubernetes.io/projected/e668534c-ad83-4c7d-9270-bffa57782d91-kube-api-access-mv9sv\") pod \"e668534c-ad83-4c7d-9270-bffa57782d91\" (UID: \"e668534c-ad83-4c7d-9270-bffa57782d91\") " Oct 05 20:26:12 crc kubenswrapper[4753]: I1005 20:26:12.264313 4753 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e668534c-ad83-4c7d-9270-bffa57782d91-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:26:12 crc kubenswrapper[4753]: I1005 20:26:12.280504 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e668534c-ad83-4c7d-9270-bffa57782d91-kube-api-access-mv9sv" (OuterVolumeSpecName: "kube-api-access-mv9sv") pod "e668534c-ad83-4c7d-9270-bffa57782d91" (UID: "e668534c-ad83-4c7d-9270-bffa57782d91"). InnerVolumeSpecName "kube-api-access-mv9sv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:26:12 crc kubenswrapper[4753]: I1005 20:26:12.295659 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e668534c-ad83-4c7d-9270-bffa57782d91-util" (OuterVolumeSpecName: "util") pod "e668534c-ad83-4c7d-9270-bffa57782d91" (UID: "e668534c-ad83-4c7d-9270-bffa57782d91"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:26:12 crc kubenswrapper[4753]: I1005 20:26:12.365977 4753 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e668534c-ad83-4c7d-9270-bffa57782d91-util\") on node \"crc\" DevicePath \"\"" Oct 05 20:26:12 crc kubenswrapper[4753]: I1005 20:26:12.366037 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv9sv\" (UniqueName: \"kubernetes.io/projected/e668534c-ad83-4c7d-9270-bffa57782d91-kube-api-access-mv9sv\") on node \"crc\" DevicePath \"\"" Oct 05 20:26:12 crc kubenswrapper[4753]: I1005 20:26:12.884930 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" event={"ID":"e668534c-ad83-4c7d-9270-bffa57782d91","Type":"ContainerDied","Data":"213083fa2bd2c55579b0d7e5ee93f638679a860cbfc301ddecf14c2c36dc2c28"} Oct 05 20:26:12 crc kubenswrapper[4753]: I1005 20:26:12.885027 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="213083fa2bd2c55579b0d7e5ee93f638679a860cbfc301ddecf14c2c36dc2c28" Oct 05 20:26:12 crc kubenswrapper[4753]: I1005 20:26:12.885091 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.047937 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-76575689f9-tr955"] Oct 05 20:26:23 crc kubenswrapper[4753]: E1005 20:26:23.048745 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e668534c-ad83-4c7d-9270-bffa57782d91" containerName="pull" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.048759 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e668534c-ad83-4c7d-9270-bffa57782d91" containerName="pull" Oct 05 20:26:23 crc kubenswrapper[4753]: E1005 20:26:23.048778 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e668534c-ad83-4c7d-9270-bffa57782d91" containerName="extract" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.048788 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e668534c-ad83-4c7d-9270-bffa57782d91" containerName="extract" Oct 05 20:26:23 crc kubenswrapper[4753]: E1005 20:26:23.048809 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb329af5-99e8-42d9-b79e-4c9acd09204d" containerName="console" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.048818 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb329af5-99e8-42d9-b79e-4c9acd09204d" containerName="console" Oct 05 20:26:23 crc kubenswrapper[4753]: E1005 20:26:23.048832 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e668534c-ad83-4c7d-9270-bffa57782d91" containerName="util" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.048839 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e668534c-ad83-4c7d-9270-bffa57782d91" containerName="util" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.048954 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="e668534c-ad83-4c7d-9270-bffa57782d91" 
containerName="extract" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.048973 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb329af5-99e8-42d9-b79e-4c9acd09204d" containerName="console" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.049491 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.052449 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.052554 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.053416 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.053576 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-dmqmb" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.054684 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.073057 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-76575689f9-tr955"] Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.206104 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e9003581-3277-433c-9c49-5a186f493cc5-webhook-cert\") pod \"metallb-operator-controller-manager-76575689f9-tr955\" (UID: \"e9003581-3277-433c-9c49-5a186f493cc5\") " pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" Oct 05 
20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.206169 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e9003581-3277-433c-9c49-5a186f493cc5-apiservice-cert\") pod \"metallb-operator-controller-manager-76575689f9-tr955\" (UID: \"e9003581-3277-433c-9c49-5a186f493cc5\") " pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.206200 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmgv5\" (UniqueName: \"kubernetes.io/projected/e9003581-3277-433c-9c49-5a186f493cc5-kube-api-access-pmgv5\") pod \"metallb-operator-controller-manager-76575689f9-tr955\" (UID: \"e9003581-3277-433c-9c49-5a186f493cc5\") " pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.286668 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt"] Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.287636 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.290278 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-zn8zf" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.290910 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.291009 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.307523 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e9003581-3277-433c-9c49-5a186f493cc5-webhook-cert\") pod \"metallb-operator-controller-manager-76575689f9-tr955\" (UID: \"e9003581-3277-433c-9c49-5a186f493cc5\") " pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.307567 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e9003581-3277-433c-9c49-5a186f493cc5-apiservice-cert\") pod \"metallb-operator-controller-manager-76575689f9-tr955\" (UID: \"e9003581-3277-433c-9c49-5a186f493cc5\") " pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.307596 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmgv5\" (UniqueName: \"kubernetes.io/projected/e9003581-3277-433c-9c49-5a186f493cc5-kube-api-access-pmgv5\") pod \"metallb-operator-controller-manager-76575689f9-tr955\" (UID: \"e9003581-3277-433c-9c49-5a186f493cc5\") " pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" Oct 05 20:26:23 crc 
kubenswrapper[4753]: I1005 20:26:23.314121 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e9003581-3277-433c-9c49-5a186f493cc5-apiservice-cert\") pod \"metallb-operator-controller-manager-76575689f9-tr955\" (UID: \"e9003581-3277-433c-9c49-5a186f493cc5\") " pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.314676 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e9003581-3277-433c-9c49-5a186f493cc5-webhook-cert\") pod \"metallb-operator-controller-manager-76575689f9-tr955\" (UID: \"e9003581-3277-433c-9c49-5a186f493cc5\") " pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.319548 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt"] Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.336392 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmgv5\" (UniqueName: \"kubernetes.io/projected/e9003581-3277-433c-9c49-5a186f493cc5-kube-api-access-pmgv5\") pod \"metallb-operator-controller-manager-76575689f9-tr955\" (UID: \"e9003581-3277-433c-9c49-5a186f493cc5\") " pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.367311 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.415413 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2849418e-7428-46b6-89b0-fc001cb09db2-apiservice-cert\") pod \"metallb-operator-webhook-server-78c6d655f5-9pcgt\" (UID: \"2849418e-7428-46b6-89b0-fc001cb09db2\") " pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.415928 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2849418e-7428-46b6-89b0-fc001cb09db2-webhook-cert\") pod \"metallb-operator-webhook-server-78c6d655f5-9pcgt\" (UID: \"2849418e-7428-46b6-89b0-fc001cb09db2\") " pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.416012 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crnq7\" (UniqueName: \"kubernetes.io/projected/2849418e-7428-46b6-89b0-fc001cb09db2-kube-api-access-crnq7\") pod \"metallb-operator-webhook-server-78c6d655f5-9pcgt\" (UID: \"2849418e-7428-46b6-89b0-fc001cb09db2\") " pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.517798 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crnq7\" (UniqueName: \"kubernetes.io/projected/2849418e-7428-46b6-89b0-fc001cb09db2-kube-api-access-crnq7\") pod \"metallb-operator-webhook-server-78c6d655f5-9pcgt\" (UID: \"2849418e-7428-46b6-89b0-fc001cb09db2\") " pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.517905 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2849418e-7428-46b6-89b0-fc001cb09db2-apiservice-cert\") pod \"metallb-operator-webhook-server-78c6d655f5-9pcgt\" (UID: \"2849418e-7428-46b6-89b0-fc001cb09db2\") " pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.517925 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2849418e-7428-46b6-89b0-fc001cb09db2-webhook-cert\") pod \"metallb-operator-webhook-server-78c6d655f5-9pcgt\" (UID: \"2849418e-7428-46b6-89b0-fc001cb09db2\") " pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.525085 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2849418e-7428-46b6-89b0-fc001cb09db2-apiservice-cert\") pod \"metallb-operator-webhook-server-78c6d655f5-9pcgt\" (UID: \"2849418e-7428-46b6-89b0-fc001cb09db2\") " pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.525567 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2849418e-7428-46b6-89b0-fc001cb09db2-webhook-cert\") pod \"metallb-operator-webhook-server-78c6d655f5-9pcgt\" (UID: \"2849418e-7428-46b6-89b0-fc001cb09db2\") " pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.561636 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crnq7\" (UniqueName: \"kubernetes.io/projected/2849418e-7428-46b6-89b0-fc001cb09db2-kube-api-access-crnq7\") pod \"metallb-operator-webhook-server-78c6d655f5-9pcgt\" (UID: 
\"2849418e-7428-46b6-89b0-fc001cb09db2\") " pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.602479 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.775944 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-76575689f9-tr955"] Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.882237 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt"] Oct 05 20:26:23 crc kubenswrapper[4753]: W1005 20:26:23.892852 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2849418e_7428_46b6_89b0_fc001cb09db2.slice/crio-ee08b5e06433c6f14209cbc13d9877323e93a68cc6991d4c600d62b2e117e36d WatchSource:0}: Error finding container ee08b5e06433c6f14209cbc13d9877323e93a68cc6991d4c600d62b2e117e36d: Status 404 returned error can't find the container with id ee08b5e06433c6f14209cbc13d9877323e93a68cc6991d4c600d62b2e117e36d Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.957133 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" event={"ID":"2849418e-7428-46b6-89b0-fc001cb09db2","Type":"ContainerStarted","Data":"ee08b5e06433c6f14209cbc13d9877323e93a68cc6991d4c600d62b2e117e36d"} Oct 05 20:26:23 crc kubenswrapper[4753]: I1005 20:26:23.958126 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" event={"ID":"e9003581-3277-433c-9c49-5a186f493cc5","Type":"ContainerStarted","Data":"524f8da06620951369a2e12a0bf6028f4a8a4e449fa19e1cdac2975c0df60c8d"} Oct 05 20:26:26 crc kubenswrapper[4753]: I1005 20:26:26.975129 4753 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" event={"ID":"e9003581-3277-433c-9c49-5a186f493cc5","Type":"ContainerStarted","Data":"2a1af5d372a332561947d9867bfc62530e4c2795e4d3de14c047898cc746fa86"} Oct 05 20:26:26 crc kubenswrapper[4753]: I1005 20:26:26.975533 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" Oct 05 20:26:27 crc kubenswrapper[4753]: I1005 20:26:27.001097 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" podStartSLOduration=1.024649313 podStartE2EDuration="4.001079518s" podCreationTimestamp="2025-10-05 20:26:23 +0000 UTC" firstStartedPulling="2025-10-05 20:26:23.78496035 +0000 UTC m=+692.633288582" lastFinishedPulling="2025-10-05 20:26:26.761390555 +0000 UTC m=+695.609718787" observedRunningTime="2025-10-05 20:26:26.99632011 +0000 UTC m=+695.844648342" watchObservedRunningTime="2025-10-05 20:26:27.001079518 +0000 UTC m=+695.849407750" Oct 05 20:26:29 crc kubenswrapper[4753]: I1005 20:26:29.995235 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" event={"ID":"2849418e-7428-46b6-89b0-fc001cb09db2","Type":"ContainerStarted","Data":"f1840a3327b714b0464599998428d00ad6d10d07113688fc0280bc5ee7ef5eee"} Oct 05 20:26:29 crc kubenswrapper[4753]: I1005 20:26:29.995822 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" Oct 05 20:26:30 crc kubenswrapper[4753]: I1005 20:26:30.017491 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" podStartSLOduration=1.7691231950000001 podStartE2EDuration="7.017473711s" podCreationTimestamp="2025-10-05 20:26:23 
+0000 UTC" firstStartedPulling="2025-10-05 20:26:23.895270046 +0000 UTC m=+692.743598288" lastFinishedPulling="2025-10-05 20:26:29.143620572 +0000 UTC m=+697.991948804" observedRunningTime="2025-10-05 20:26:30.012500527 +0000 UTC m=+698.860828759" watchObservedRunningTime="2025-10-05 20:26:30.017473711 +0000 UTC m=+698.865801933" Oct 05 20:26:43 crc kubenswrapper[4753]: I1005 20:26:43.607235 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-78c6d655f5-9pcgt" Oct 05 20:27:03 crc kubenswrapper[4753]: I1005 20:27:03.371582 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-76575689f9-tr955" Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.270515 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-gvxbk"] Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.273154 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-gvxbk" Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.277587 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj"] Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.278431 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.280502 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-q9zz2"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.280746 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.280932 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.281157 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.289156 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj"]
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.401281 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-z5l8d"]
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.402049 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-bbs9p"]
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.403917 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.411124 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.411307 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-c87mx"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.411371 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.411506 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.416431 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-bbs9p"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.418355 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.426562 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-bbs9p"]
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.465794 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5q95\" (UniqueName: \"kubernetes.io/projected/acdc2b18-6e82-4c5b-964f-b708c56c3704-kube-api-access-q5q95\") pod \"frr-k8s-webhook-server-64bf5d555-b4sqj\" (UID: \"acdc2b18-6e82-4c5b-964f-b708c56c3704\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.465848 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-frr-startup\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.465888 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-metrics-certs\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.465915 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npqpz\" (UniqueName: \"kubernetes.io/projected/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-kube-api-access-npqpz\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.465941 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-reloader\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.465957 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-frr-conf\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.465984 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-frr-sockets\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.466000 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acdc2b18-6e82-4c5b-964f-b708c56c3704-cert\") pod \"frr-k8s-webhook-server-64bf5d555-b4sqj\" (UID: \"acdc2b18-6e82-4c5b-964f-b708c56c3704\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.466020 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-metrics\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.489728 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.489783 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567486 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npqpz\" (UniqueName: \"kubernetes.io/projected/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-kube-api-access-npqpz\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567540 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-reloader\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567565 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snbtd\" (UniqueName: \"kubernetes.io/projected/e8a9ee2c-3d45-4169-b558-9c27e68cc25f-kube-api-access-snbtd\") pod \"controller-68d546b9d8-bbs9p\" (UID: \"e8a9ee2c-3d45-4169-b558-9c27e68cc25f\") " pod="metallb-system/controller-68d546b9d8-bbs9p"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567586 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-frr-conf\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567610 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-metrics-certs\") pod \"speaker-z5l8d\" (UID: \"bb160ebf-df7f-4e27-b5f5-0e108e377e5d\") " pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567633 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8a9ee2c-3d45-4169-b558-9c27e68cc25f-cert\") pod \"controller-68d546b9d8-bbs9p\" (UID: \"e8a9ee2c-3d45-4169-b558-9c27e68cc25f\") " pod="metallb-system/controller-68d546b9d8-bbs9p"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567653 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-frr-sockets\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567667 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-metallb-excludel2\") pod \"speaker-z5l8d\" (UID: \"bb160ebf-df7f-4e27-b5f5-0e108e377e5d\") " pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567681 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-memberlist\") pod \"speaker-z5l8d\" (UID: \"bb160ebf-df7f-4e27-b5f5-0e108e377e5d\") " pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567697 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acdc2b18-6e82-4c5b-964f-b708c56c3704-cert\") pod \"frr-k8s-webhook-server-64bf5d555-b4sqj\" (UID: \"acdc2b18-6e82-4c5b-964f-b708c56c3704\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567719 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-metrics\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567745 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5q95\" (UniqueName: \"kubernetes.io/projected/acdc2b18-6e82-4c5b-964f-b708c56c3704-kube-api-access-q5q95\") pod \"frr-k8s-webhook-server-64bf5d555-b4sqj\" (UID: \"acdc2b18-6e82-4c5b-964f-b708c56c3704\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567766 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-frr-startup\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567790 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8a9ee2c-3d45-4169-b558-9c27e68cc25f-metrics-certs\") pod \"controller-68d546b9d8-bbs9p\" (UID: \"e8a9ee2c-3d45-4169-b558-9c27e68cc25f\") " pod="metallb-system/controller-68d546b9d8-bbs9p"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567806 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjd5b\" (UniqueName: \"kubernetes.io/projected/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-kube-api-access-kjd5b\") pod \"speaker-z5l8d\" (UID: \"bb160ebf-df7f-4e27-b5f5-0e108e377e5d\") " pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567825 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-metrics-certs\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.567990 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-reloader\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.568222 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-frr-conf\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.568543 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-metrics\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.568556 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-frr-sockets\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.569326 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-frr-startup\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.573906 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acdc2b18-6e82-4c5b-964f-b708c56c3704-cert\") pod \"frr-k8s-webhook-server-64bf5d555-b4sqj\" (UID: \"acdc2b18-6e82-4c5b-964f-b708c56c3704\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.585870 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5q95\" (UniqueName: \"kubernetes.io/projected/acdc2b18-6e82-4c5b-964f-b708c56c3704-kube-api-access-q5q95\") pod \"frr-k8s-webhook-server-64bf5d555-b4sqj\" (UID: \"acdc2b18-6e82-4c5b-964f-b708c56c3704\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.586095 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-metrics-certs\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.595687 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npqpz\" (UniqueName: \"kubernetes.io/projected/e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54-kube-api-access-npqpz\") pod \"frr-k8s-gvxbk\" (UID: \"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54\") " pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.596655 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.668553 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8a9ee2c-3d45-4169-b558-9c27e68cc25f-metrics-certs\") pod \"controller-68d546b9d8-bbs9p\" (UID: \"e8a9ee2c-3d45-4169-b558-9c27e68cc25f\") " pod="metallb-system/controller-68d546b9d8-bbs9p"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.668587 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjd5b\" (UniqueName: \"kubernetes.io/projected/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-kube-api-access-kjd5b\") pod \"speaker-z5l8d\" (UID: \"bb160ebf-df7f-4e27-b5f5-0e108e377e5d\") " pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.668631 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snbtd\" (UniqueName: \"kubernetes.io/projected/e8a9ee2c-3d45-4169-b558-9c27e68cc25f-kube-api-access-snbtd\") pod \"controller-68d546b9d8-bbs9p\" (UID: \"e8a9ee2c-3d45-4169-b558-9c27e68cc25f\") " pod="metallb-system/controller-68d546b9d8-bbs9p"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.668651 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-metrics-certs\") pod \"speaker-z5l8d\" (UID: \"bb160ebf-df7f-4e27-b5f5-0e108e377e5d\") " pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.668670 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8a9ee2c-3d45-4169-b558-9c27e68cc25f-cert\") pod \"controller-68d546b9d8-bbs9p\" (UID: \"e8a9ee2c-3d45-4169-b558-9c27e68cc25f\") " pod="metallb-system/controller-68d546b9d8-bbs9p"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.668691 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-metallb-excludel2\") pod \"speaker-z5l8d\" (UID: \"bb160ebf-df7f-4e27-b5f5-0e108e377e5d\") " pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.668706 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-memberlist\") pod \"speaker-z5l8d\" (UID: \"bb160ebf-df7f-4e27-b5f5-0e108e377e5d\") " pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:04 crc kubenswrapper[4753]: E1005 20:27:04.668817 4753 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 05 20:27:04 crc kubenswrapper[4753]: E1005 20:27:04.668868 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-memberlist podName:bb160ebf-df7f-4e27-b5f5-0e108e377e5d nodeName:}" failed. No retries permitted until 2025-10-05 20:27:05.16885281 +0000 UTC m=+734.017181042 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-memberlist") pod "speaker-z5l8d" (UID: "bb160ebf-df7f-4e27-b5f5-0e108e377e5d") : secret "metallb-memberlist" not found
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.672046 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-metallb-excludel2\") pod \"speaker-z5l8d\" (UID: \"bb160ebf-df7f-4e27-b5f5-0e108e377e5d\") " pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.677301 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.682805 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8a9ee2c-3d45-4169-b558-9c27e68cc25f-metrics-certs\") pod \"controller-68d546b9d8-bbs9p\" (UID: \"e8a9ee2c-3d45-4169-b558-9c27e68cc25f\") " pod="metallb-system/controller-68d546b9d8-bbs9p"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.688936 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8a9ee2c-3d45-4169-b558-9c27e68cc25f-cert\") pod \"controller-68d546b9d8-bbs9p\" (UID: \"e8a9ee2c-3d45-4169-b558-9c27e68cc25f\") " pod="metallb-system/controller-68d546b9d8-bbs9p"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.693253 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-metrics-certs\") pod \"speaker-z5l8d\" (UID: \"bb160ebf-df7f-4e27-b5f5-0e108e377e5d\") " pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.695429 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjd5b\" (UniqueName: \"kubernetes.io/projected/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-kube-api-access-kjd5b\") pod \"speaker-z5l8d\" (UID: \"bb160ebf-df7f-4e27-b5f5-0e108e377e5d\") " pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.698072 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snbtd\" (UniqueName: \"kubernetes.io/projected/e8a9ee2c-3d45-4169-b558-9c27e68cc25f-kube-api-access-snbtd\") pod \"controller-68d546b9d8-bbs9p\" (UID: \"e8a9ee2c-3d45-4169-b558-9c27e68cc25f\") " pod="metallb-system/controller-68d546b9d8-bbs9p"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.746948 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-bbs9p"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.890523 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-gvxbk"
Oct 05 20:27:04 crc kubenswrapper[4753]: I1005 20:27:04.901300 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj"]
Oct 05 20:27:05 crc kubenswrapper[4753]: I1005 20:27:05.031716 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-bbs9p"]
Oct 05 20:27:05 crc kubenswrapper[4753]: W1005 20:27:05.039326 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8a9ee2c_3d45_4169_b558_9c27e68cc25f.slice/crio-ae03ecea3538512a87c3dc4c27091428b94d1f80078ba6728cc41d57e3bb1cdf WatchSource:0}: Error finding container ae03ecea3538512a87c3dc4c27091428b94d1f80078ba6728cc41d57e3bb1cdf: Status 404 returned error can't find the container with id ae03ecea3538512a87c3dc4c27091428b94d1f80078ba6728cc41d57e3bb1cdf
Oct 05 20:27:05 crc kubenswrapper[4753]: I1005 20:27:05.176443 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-memberlist\") pod \"speaker-z5l8d\" (UID: \"bb160ebf-df7f-4e27-b5f5-0e108e377e5d\") " pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:05 crc kubenswrapper[4753]: E1005 20:27:05.176601 4753 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 05 20:27:05 crc kubenswrapper[4753]: E1005 20:27:05.176662 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-memberlist podName:bb160ebf-df7f-4e27-b5f5-0e108e377e5d nodeName:}" failed. No retries permitted until 2025-10-05 20:27:06.176646628 +0000 UTC m=+735.024974850 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-memberlist") pod "speaker-z5l8d" (UID: "bb160ebf-df7f-4e27-b5f5-0e108e377e5d") : secret "metallb-memberlist" not found
Oct 05 20:27:05 crc kubenswrapper[4753]: I1005 20:27:05.186205 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-bbs9p" event={"ID":"e8a9ee2c-3d45-4169-b558-9c27e68cc25f","Type":"ContainerStarted","Data":"32017f222b1633ab02cc4ed3c9ae5ef9e59bddc41449e65573f9cc5c97842f56"}
Oct 05 20:27:05 crc kubenswrapper[4753]: I1005 20:27:05.186248 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-bbs9p" event={"ID":"e8a9ee2c-3d45-4169-b558-9c27e68cc25f","Type":"ContainerStarted","Data":"ae03ecea3538512a87c3dc4c27091428b94d1f80078ba6728cc41d57e3bb1cdf"}
Oct 05 20:27:05 crc kubenswrapper[4753]: I1005 20:27:05.187562 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj" event={"ID":"acdc2b18-6e82-4c5b-964f-b708c56c3704","Type":"ContainerStarted","Data":"91632edb3353c8b85e3bb841e9dd0bf7656e727f4cdc2d2170118be6f27267e0"}
Oct 05 20:27:05 crc kubenswrapper[4753]: I1005 20:27:05.188417 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gvxbk" event={"ID":"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54","Type":"ContainerStarted","Data":"3fa2762ded3d6482a28c45a5bc709292c97d62b12a6591f7e43c670eb809ba5a"}
Oct 05 20:27:06 crc kubenswrapper[4753]: I1005 20:27:06.190602 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-memberlist\") pod \"speaker-z5l8d\" (UID: \"bb160ebf-df7f-4e27-b5f5-0e108e377e5d\") " pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:06 crc kubenswrapper[4753]: I1005 20:27:06.195380 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-bbs9p" event={"ID":"e8a9ee2c-3d45-4169-b558-9c27e68cc25f","Type":"ContainerStarted","Data":"a8e125ab9b66b994b6de7bd2d57fb4f449a91d3dadbc11115e9125908b721678"}
Oct 05 20:27:06 crc kubenswrapper[4753]: I1005 20:27:06.195778 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-bbs9p"
Oct 05 20:27:06 crc kubenswrapper[4753]: I1005 20:27:06.197878 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bb160ebf-df7f-4e27-b5f5-0e108e377e5d-memberlist\") pod \"speaker-z5l8d\" (UID: \"bb160ebf-df7f-4e27-b5f5-0e108e377e5d\") " pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:06 crc kubenswrapper[4753]: I1005 20:27:06.238058 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:06 crc kubenswrapper[4753]: W1005 20:27:06.277657 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb160ebf_df7f_4e27_b5f5_0e108e377e5d.slice/crio-3067b554c052dcd8ffe78dd029b6c25bd20f1570afebff451b85ab08ef1ba29c WatchSource:0}: Error finding container 3067b554c052dcd8ffe78dd029b6c25bd20f1570afebff451b85ab08ef1ba29c: Status 404 returned error can't find the container with id 3067b554c052dcd8ffe78dd029b6c25bd20f1570afebff451b85ab08ef1ba29c
Oct 05 20:27:07 crc kubenswrapper[4753]: I1005 20:27:07.202083 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-z5l8d" event={"ID":"bb160ebf-df7f-4e27-b5f5-0e108e377e5d","Type":"ContainerStarted","Data":"ee0bb630e5f4f746ba2a34bba67573b46d1e89e2584053ed049265bcbcd9c902"}
Oct 05 20:27:07 crc kubenswrapper[4753]: I1005 20:27:07.202401 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-z5l8d" event={"ID":"bb160ebf-df7f-4e27-b5f5-0e108e377e5d","Type":"ContainerStarted","Data":"c2d17d8922eb5285653990beb0092640658b8ee97e14ecb80c11865a64ca8541"}
Oct 05 20:27:07 crc kubenswrapper[4753]: I1005 20:27:07.202413 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-z5l8d" event={"ID":"bb160ebf-df7f-4e27-b5f5-0e108e377e5d","Type":"ContainerStarted","Data":"3067b554c052dcd8ffe78dd029b6c25bd20f1570afebff451b85ab08ef1ba29c"}
Oct 05 20:27:07 crc kubenswrapper[4753]: I1005 20:27:07.203290 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:07 crc kubenswrapper[4753]: I1005 20:27:07.240757 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-bbs9p" podStartSLOduration=3.24073527 podStartE2EDuration="3.24073527s" podCreationTimestamp="2025-10-05 20:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:27:06.220577702 +0000 UTC m=+735.068905974" watchObservedRunningTime="2025-10-05 20:27:07.24073527 +0000 UTC m=+736.089063512"
Oct 05 20:27:07 crc kubenswrapper[4753]: I1005 20:27:07.241040 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-z5l8d" podStartSLOduration=3.24103277 podStartE2EDuration="3.24103277s" podCreationTimestamp="2025-10-05 20:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:27:07.238877313 +0000 UTC m=+736.087205535" watchObservedRunningTime="2025-10-05 20:27:07.24103277 +0000 UTC m=+736.089361002"
Oct 05 20:27:14 crc kubenswrapper[4753]: I1005 20:27:14.236720 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj" event={"ID":"acdc2b18-6e82-4c5b-964f-b708c56c3704","Type":"ContainerStarted","Data":"28b4fe737301f5aa78f8af0704cc1e3adc7e13174c8145ce0c6815d280f1fa18"}
Oct 05 20:27:14 crc kubenswrapper[4753]: I1005 20:27:14.237250 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj"
Oct 05 20:27:14 crc kubenswrapper[4753]: I1005 20:27:14.238797 4753 generic.go:334] "Generic (PLEG): container finished" podID="e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54" containerID="184bebf76272b5110faaa36733d894bccf4492eea8547467850a1c7db0a161ff" exitCode=0
Oct 05 20:27:14 crc kubenswrapper[4753]: I1005 20:27:14.238842 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gvxbk" event={"ID":"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54","Type":"ContainerDied","Data":"184bebf76272b5110faaa36733d894bccf4492eea8547467850a1c7db0a161ff"}
Oct 05 20:27:14 crc kubenswrapper[4753]: I1005 20:27:14.252725 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj" podStartSLOduration=1.768710615 podStartE2EDuration="10.252708263s" podCreationTimestamp="2025-10-05 20:27:04 +0000 UTC" firstStartedPulling="2025-10-05 20:27:04.926301139 +0000 UTC m=+733.774629371" lastFinishedPulling="2025-10-05 20:27:13.410298787 +0000 UTC m=+742.258627019" observedRunningTime="2025-10-05 20:27:14.251018111 +0000 UTC m=+743.099346343" watchObservedRunningTime="2025-10-05 20:27:14.252708263 +0000 UTC m=+743.101036495"
Oct 05 20:27:15 crc kubenswrapper[4753]: I1005 20:27:15.245713 4753 generic.go:334] "Generic (PLEG): container finished" podID="e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54" containerID="06683f3db0a6d1abe05c3147ed781b9bf011b403314a15e73d3abd0694fe82a5" exitCode=0
Oct 05 20:27:15 crc kubenswrapper[4753]: I1005 20:27:15.246173 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gvxbk" event={"ID":"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54","Type":"ContainerDied","Data":"06683f3db0a6d1abe05c3147ed781b9bf011b403314a15e73d3abd0694fe82a5"}
Oct 05 20:27:15 crc kubenswrapper[4753]: I1005 20:27:15.614318 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9jfj"]
Oct 05 20:27:15 crc kubenswrapper[4753]: I1005 20:27:15.614703 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" podUID="6e73e9c7-3af4-4b10-a331-7899608702b3" containerName="controller-manager" containerID="cri-o://a094bddebbfc33e5c4739e6518c6ca2f92f9f90cd83c380bb7e7d18f17095679" gracePeriod=30
Oct 05 20:27:15 crc kubenswrapper[4753]: I1005 20:27:15.745376 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v"]
Oct 05 20:27:15 crc kubenswrapper[4753]: I1005 20:27:15.745661 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" podUID="5615672e-59bf-448b-88ba-75a02438a8ad" containerName="route-controller-manager" containerID="cri-o://6edf788c0d9a70295f8b4c577424f1378038a37a0d1d9a8def15784b8bc61bdc" gracePeriod=30
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.176337 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj"
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.189196 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v"
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.244612 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-z5l8d"
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.253874 4753 generic.go:334] "Generic (PLEG): container finished" podID="6e73e9c7-3af4-4b10-a331-7899608702b3" containerID="a094bddebbfc33e5c4739e6518c6ca2f92f9f90cd83c380bb7e7d18f17095679" exitCode=0
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.253928 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" event={"ID":"6e73e9c7-3af4-4b10-a331-7899608702b3","Type":"ContainerDied","Data":"a094bddebbfc33e5c4739e6518c6ca2f92f9f90cd83c380bb7e7d18f17095679"}
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.253955 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj" event={"ID":"6e73e9c7-3af4-4b10-a331-7899608702b3","Type":"ContainerDied","Data":"aecff357a6639c01f4249264a28698e8b9f24e81b4039c03ae46a9d5aa221f95"}
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.253976 4753 scope.go:117] "RemoveContainer" containerID="a094bddebbfc33e5c4739e6518c6ca2f92f9f90cd83c380bb7e7d18f17095679"
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.254070 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9jfj"
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.258166 4753 generic.go:334] "Generic (PLEG): container finished" podID="5615672e-59bf-448b-88ba-75a02438a8ad" containerID="6edf788c0d9a70295f8b4c577424f1378038a37a0d1d9a8def15784b8bc61bdc" exitCode=0
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.258221 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" event={"ID":"5615672e-59bf-448b-88ba-75a02438a8ad","Type":"ContainerDied","Data":"6edf788c0d9a70295f8b4c577424f1378038a37a0d1d9a8def15784b8bc61bdc"}
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.258242 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v" event={"ID":"5615672e-59bf-448b-88ba-75a02438a8ad","Type":"ContainerDied","Data":"4af13d8f75de63241690b46c2ae5b7f0ae37477f1a6d0121e44b05057a1bc417"}
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.258289 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v"
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.260269 4753 generic.go:334] "Generic (PLEG): container finished" podID="e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54" containerID="541390dbcf44ce564789337b80ebdcada705d8ce4b835fc98add6c54c8aa7b3f" exitCode=0
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.260295 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gvxbk" event={"ID":"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54","Type":"ContainerDied","Data":"541390dbcf44ce564789337b80ebdcada705d8ce4b835fc98add6c54c8aa7b3f"}
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.280117 4753 scope.go:117] "RemoveContainer" containerID="a094bddebbfc33e5c4739e6518c6ca2f92f9f90cd83c380bb7e7d18f17095679"
Oct 05 20:27:16 crc kubenswrapper[4753]: E1005 20:27:16.280580 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a094bddebbfc33e5c4739e6518c6ca2f92f9f90cd83c380bb7e7d18f17095679\": container with ID starting with a094bddebbfc33e5c4739e6518c6ca2f92f9f90cd83c380bb7e7d18f17095679 not found: ID does not exist" containerID="a094bddebbfc33e5c4739e6518c6ca2f92f9f90cd83c380bb7e7d18f17095679"
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.280616 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a094bddebbfc33e5c4739e6518c6ca2f92f9f90cd83c380bb7e7d18f17095679"} err="failed to get container status \"a094bddebbfc33e5c4739e6518c6ca2f92f9f90cd83c380bb7e7d18f17095679\": rpc error: code = NotFound desc = could not find container \"a094bddebbfc33e5c4739e6518c6ca2f92f9f90cd83c380bb7e7d18f17095679\": container with ID starting with a094bddebbfc33e5c4739e6518c6ca2f92f9f90cd83c380bb7e7d18f17095679 not found: ID does not exist"
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.280644 4753 scope.go:117] "RemoveContainer" containerID="6edf788c0d9a70295f8b4c577424f1378038a37a0d1d9a8def15784b8bc61bdc"
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.332625 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5615672e-59bf-448b-88ba-75a02438a8ad-serving-cert\") pod \"5615672e-59bf-448b-88ba-75a02438a8ad\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") "
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.332703 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5615672e-59bf-448b-88ba-75a02438a8ad-client-ca\") pod \"5615672e-59bf-448b-88ba-75a02438a8ad\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") "
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.332728 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5615672e-59bf-448b-88ba-75a02438a8ad-config\") pod \"5615672e-59bf-448b-88ba-75a02438a8ad\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") "
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.332780 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-proxy-ca-bundles\") pod \"6e73e9c7-3af4-4b10-a331-7899608702b3\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") "
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.332861 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nspjp\" (UniqueName: \"kubernetes.io/projected/6e73e9c7-3af4-4b10-a331-7899608702b3-kube-api-access-nspjp\") pod \"6e73e9c7-3af4-4b10-a331-7899608702b3\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") "
Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.332883 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"config\" (UniqueName: \"kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-config\") pod \"6e73e9c7-3af4-4b10-a331-7899608702b3\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.332904 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-client-ca\") pod \"6e73e9c7-3af4-4b10-a331-7899608702b3\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.332948 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e73e9c7-3af4-4b10-a331-7899608702b3-serving-cert\") pod \"6e73e9c7-3af4-4b10-a331-7899608702b3\" (UID: \"6e73e9c7-3af4-4b10-a331-7899608702b3\") " Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.332982 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxfn5\" (UniqueName: \"kubernetes.io/projected/5615672e-59bf-448b-88ba-75a02438a8ad-kube-api-access-cxfn5\") pod \"5615672e-59bf-448b-88ba-75a02438a8ad\" (UID: \"5615672e-59bf-448b-88ba-75a02438a8ad\") " Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.340530 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5615672e-59bf-448b-88ba-75a02438a8ad-kube-api-access-cxfn5" (OuterVolumeSpecName: "kube-api-access-cxfn5") pod "5615672e-59bf-448b-88ba-75a02438a8ad" (UID: "5615672e-59bf-448b-88ba-75a02438a8ad"). InnerVolumeSpecName "kube-api-access-cxfn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.341400 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-client-ca" (OuterVolumeSpecName: "client-ca") pod "6e73e9c7-3af4-4b10-a331-7899608702b3" (UID: "6e73e9c7-3af4-4b10-a331-7899608702b3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.341510 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-config" (OuterVolumeSpecName: "config") pod "6e73e9c7-3af4-4b10-a331-7899608702b3" (UID: "6e73e9c7-3af4-4b10-a331-7899608702b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.342173 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5615672e-59bf-448b-88ba-75a02438a8ad-config" (OuterVolumeSpecName: "config") pod "5615672e-59bf-448b-88ba-75a02438a8ad" (UID: "5615672e-59bf-448b-88ba-75a02438a8ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.342219 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5615672e-59bf-448b-88ba-75a02438a8ad-client-ca" (OuterVolumeSpecName: "client-ca") pod "5615672e-59bf-448b-88ba-75a02438a8ad" (UID: "5615672e-59bf-448b-88ba-75a02438a8ad"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.342654 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6e73e9c7-3af4-4b10-a331-7899608702b3" (UID: "6e73e9c7-3af4-4b10-a331-7899608702b3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.340794 4753 scope.go:117] "RemoveContainer" containerID="6edf788c0d9a70295f8b4c577424f1378038a37a0d1d9a8def15784b8bc61bdc" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.345742 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e73e9c7-3af4-4b10-a331-7899608702b3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6e73e9c7-3af4-4b10-a331-7899608702b3" (UID: "6e73e9c7-3af4-4b10-a331-7899608702b3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:27:16 crc kubenswrapper[4753]: E1005 20:27:16.346065 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6edf788c0d9a70295f8b4c577424f1378038a37a0d1d9a8def15784b8bc61bdc\": container with ID starting with 6edf788c0d9a70295f8b4c577424f1378038a37a0d1d9a8def15784b8bc61bdc not found: ID does not exist" containerID="6edf788c0d9a70295f8b4c577424f1378038a37a0d1d9a8def15784b8bc61bdc" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.346111 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6edf788c0d9a70295f8b4c577424f1378038a37a0d1d9a8def15784b8bc61bdc"} err="failed to get container status \"6edf788c0d9a70295f8b4c577424f1378038a37a0d1d9a8def15784b8bc61bdc\": rpc error: code = NotFound desc = could not find container \"6edf788c0d9a70295f8b4c577424f1378038a37a0d1d9a8def15784b8bc61bdc\": container with ID starting with 6edf788c0d9a70295f8b4c577424f1378038a37a0d1d9a8def15784b8bc61bdc not found: ID does not exist" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.351912 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5615672e-59bf-448b-88ba-75a02438a8ad-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5615672e-59bf-448b-88ba-75a02438a8ad" (UID: "5615672e-59bf-448b-88ba-75a02438a8ad"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.356895 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e73e9c7-3af4-4b10-a331-7899608702b3-kube-api-access-nspjp" (OuterVolumeSpecName: "kube-api-access-nspjp") pod "6e73e9c7-3af4-4b10-a331-7899608702b3" (UID: "6e73e9c7-3af4-4b10-a331-7899608702b3"). InnerVolumeSpecName "kube-api-access-nspjp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.435044 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5615672e-59bf-448b-88ba-75a02438a8ad-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.435082 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5615672e-59bf-448b-88ba-75a02438a8ad-client-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.435091 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5615672e-59bf-448b-88ba-75a02438a8ad-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.435103 4753 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.435117 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nspjp\" (UniqueName: \"kubernetes.io/projected/6e73e9c7-3af4-4b10-a331-7899608702b3-kube-api-access-nspjp\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.435126 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.435133 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e73e9c7-3af4-4b10-a331-7899608702b3-client-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.435163 4753 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e73e9c7-3af4-4b10-a331-7899608702b3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.435172 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxfn5\" (UniqueName: \"kubernetes.io/projected/5615672e-59bf-448b-88ba-75a02438a8ad-kube-api-access-cxfn5\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.601686 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9jfj"] Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.604571 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9jfj"] Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.618252 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v"] Oct 05 20:27:16 crc kubenswrapper[4753]: I1005 20:27:16.621507 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jcc7v"] Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.027512 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7cff895cf6-hg7l4"] Oct 05 20:27:17 crc kubenswrapper[4753]: E1005 20:27:17.028095 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e73e9c7-3af4-4b10-a331-7899608702b3" containerName="controller-manager" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.028112 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e73e9c7-3af4-4b10-a331-7899608702b3" containerName="controller-manager" Oct 05 20:27:17 crc kubenswrapper[4753]: E1005 20:27:17.028133 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5615672e-59bf-448b-88ba-75a02438a8ad" 
containerName="route-controller-manager" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.028156 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5615672e-59bf-448b-88ba-75a02438a8ad" containerName="route-controller-manager" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.028337 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e73e9c7-3af4-4b10-a331-7899608702b3" containerName="controller-manager" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.028347 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5615672e-59bf-448b-88ba-75a02438a8ad" containerName="route-controller-manager" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.028851 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.031824 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n"] Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.032500 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.035377 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.035563 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.035708 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.036103 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.043860 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.044115 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.044416 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.044690 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.044873 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.069958 4753 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.070571 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.071351 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.071500 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.074148 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n"] Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.079472 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cff895cf6-hg7l4"] Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.142907 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxkbt\" (UniqueName: \"kubernetes.io/projected/2abf5820-8969-4d53-9093-3d1d6beb4438-kube-api-access-fxkbt\") pod \"route-controller-manager-74fc58ccc6-2rw9n\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.142955 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6ffw\" (UniqueName: \"kubernetes.io/projected/f2a6113a-f8ed-4eb3-959c-5f07fcefb182-kube-api-access-h6ffw\") pod \"controller-manager-7cff895cf6-hg7l4\" (UID: \"f2a6113a-f8ed-4eb3-959c-5f07fcefb182\") " pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.142984 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a6113a-f8ed-4eb3-959c-5f07fcefb182-config\") pod \"controller-manager-7cff895cf6-hg7l4\" (UID: \"f2a6113a-f8ed-4eb3-959c-5f07fcefb182\") " pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.143041 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2a6113a-f8ed-4eb3-959c-5f07fcefb182-client-ca\") pod \"controller-manager-7cff895cf6-hg7l4\" (UID: \"f2a6113a-f8ed-4eb3-959c-5f07fcefb182\") " pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.143065 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2abf5820-8969-4d53-9093-3d1d6beb4438-serving-cert\") pod \"route-controller-manager-74fc58ccc6-2rw9n\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.143082 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2a6113a-f8ed-4eb3-959c-5f07fcefb182-serving-cert\") pod \"controller-manager-7cff895cf6-hg7l4\" (UID: \"f2a6113a-f8ed-4eb3-959c-5f07fcefb182\") " pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.143100 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f2a6113a-f8ed-4eb3-959c-5f07fcefb182-proxy-ca-bundles\") pod \"controller-manager-7cff895cf6-hg7l4\" 
(UID: \"f2a6113a-f8ed-4eb3-959c-5f07fcefb182\") " pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.143130 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2abf5820-8969-4d53-9093-3d1d6beb4438-client-ca\") pod \"route-controller-manager-74fc58ccc6-2rw9n\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.143166 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abf5820-8969-4d53-9093-3d1d6beb4438-config\") pod \"route-controller-manager-74fc58ccc6-2rw9n\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.244725 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2a6113a-f8ed-4eb3-959c-5f07fcefb182-client-ca\") pod \"controller-manager-7cff895cf6-hg7l4\" (UID: \"f2a6113a-f8ed-4eb3-959c-5f07fcefb182\") " pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.244785 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2abf5820-8969-4d53-9093-3d1d6beb4438-serving-cert\") pod \"route-controller-manager-74fc58ccc6-2rw9n\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.244816 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2a6113a-f8ed-4eb3-959c-5f07fcefb182-serving-cert\") pod \"controller-manager-7cff895cf6-hg7l4\" (UID: \"f2a6113a-f8ed-4eb3-959c-5f07fcefb182\") " pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.244833 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f2a6113a-f8ed-4eb3-959c-5f07fcefb182-proxy-ca-bundles\") pod \"controller-manager-7cff895cf6-hg7l4\" (UID: \"f2a6113a-f8ed-4eb3-959c-5f07fcefb182\") " pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.244865 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2abf5820-8969-4d53-9093-3d1d6beb4438-client-ca\") pod \"route-controller-manager-74fc58ccc6-2rw9n\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.244889 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abf5820-8969-4d53-9093-3d1d6beb4438-config\") pod \"route-controller-manager-74fc58ccc6-2rw9n\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.244922 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxkbt\" (UniqueName: \"kubernetes.io/projected/2abf5820-8969-4d53-9093-3d1d6beb4438-kube-api-access-fxkbt\") pod \"route-controller-manager-74fc58ccc6-2rw9n\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " 
pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.244942 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6ffw\" (UniqueName: \"kubernetes.io/projected/f2a6113a-f8ed-4eb3-959c-5f07fcefb182-kube-api-access-h6ffw\") pod \"controller-manager-7cff895cf6-hg7l4\" (UID: \"f2a6113a-f8ed-4eb3-959c-5f07fcefb182\") " pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.244960 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a6113a-f8ed-4eb3-959c-5f07fcefb182-config\") pod \"controller-manager-7cff895cf6-hg7l4\" (UID: \"f2a6113a-f8ed-4eb3-959c-5f07fcefb182\") " pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.245629 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2a6113a-f8ed-4eb3-959c-5f07fcefb182-client-ca\") pod \"controller-manager-7cff895cf6-hg7l4\" (UID: \"f2a6113a-f8ed-4eb3-959c-5f07fcefb182\") " pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.246257 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a6113a-f8ed-4eb3-959c-5f07fcefb182-config\") pod \"controller-manager-7cff895cf6-hg7l4\" (UID: \"f2a6113a-f8ed-4eb3-959c-5f07fcefb182\") " pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.246300 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2abf5820-8969-4d53-9093-3d1d6beb4438-client-ca\") pod 
\"route-controller-manager-74fc58ccc6-2rw9n\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.247000 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abf5820-8969-4d53-9093-3d1d6beb4438-config\") pod \"route-controller-manager-74fc58ccc6-2rw9n\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.247904 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f2a6113a-f8ed-4eb3-959c-5f07fcefb182-proxy-ca-bundles\") pod \"controller-manager-7cff895cf6-hg7l4\" (UID: \"f2a6113a-f8ed-4eb3-959c-5f07fcefb182\") " pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.250449 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2a6113a-f8ed-4eb3-959c-5f07fcefb182-serving-cert\") pod \"controller-manager-7cff895cf6-hg7l4\" (UID: \"f2a6113a-f8ed-4eb3-959c-5f07fcefb182\") " pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.259551 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2abf5820-8969-4d53-9093-3d1d6beb4438-serving-cert\") pod \"route-controller-manager-74fc58ccc6-2rw9n\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.262309 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fxkbt\" (UniqueName: \"kubernetes.io/projected/2abf5820-8969-4d53-9093-3d1d6beb4438-kube-api-access-fxkbt\") pod \"route-controller-manager-74fc58ccc6-2rw9n\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.263814 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6ffw\" (UniqueName: \"kubernetes.io/projected/f2a6113a-f8ed-4eb3-959c-5f07fcefb182-kube-api-access-h6ffw\") pod \"controller-manager-7cff895cf6-hg7l4\" (UID: \"f2a6113a-f8ed-4eb3-959c-5f07fcefb182\") " pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.270406 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gvxbk" event={"ID":"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54","Type":"ContainerStarted","Data":"f5aaad24d76ed42d90aec7e949ba10a122d31c5ed2d00e57630b4f50554cfefa"} Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.270433 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gvxbk" event={"ID":"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54","Type":"ContainerStarted","Data":"6c1c4dc7e12bd09e6f7d1009b88187d245ad7cd32b333a2e9a6426e93d856a77"} Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.270445 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gvxbk" event={"ID":"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54","Type":"ContainerStarted","Data":"f059de2233f523c1a7c0065cb251a0cf63396dbad51f7ecac6ddc181a67f21d3"} Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.270481 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gvxbk" event={"ID":"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54","Type":"ContainerStarted","Data":"5270846ea45f300eb46b327021170a12bcf165b909657df31611efd7f47a4b7c"} Oct 05 20:27:17 crc kubenswrapper[4753]: 
I1005 20:27:17.270490 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gvxbk" event={"ID":"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54","Type":"ContainerStarted","Data":"67f8479c26787c88e17209ce85731bec336e22ee8feabc110e98885b70847756"} Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.382323 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.390239 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.613624 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cff895cf6-hg7l4"] Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.660234 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n"] Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.860111 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5615672e-59bf-448b-88ba-75a02438a8ad" path="/var/lib/kubelet/pods/5615672e-59bf-448b-88ba-75a02438a8ad/volumes" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.860755 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e73e9c7-3af4-4b10-a331-7899608702b3" path="/var/lib/kubelet/pods/6e73e9c7-3af4-4b10-a331-7899608702b3/volumes" Oct 05 20:27:17 crc kubenswrapper[4753]: I1005 20:27:17.890391 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n"] Oct 05 20:27:18 crc kubenswrapper[4753]: I1005 20:27:18.279857 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gvxbk" 
event={"ID":"e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54","Type":"ContainerStarted","Data":"d3fb87696422237fec9469b2d6f9c9701e78e3e0e9c4b079b3d4455e2e6798c9"} Oct 05 20:27:18 crc kubenswrapper[4753]: I1005 20:27:18.280576 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-gvxbk" Oct 05 20:27:18 crc kubenswrapper[4753]: I1005 20:27:18.281471 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" event={"ID":"2abf5820-8969-4d53-9093-3d1d6beb4438","Type":"ContainerStarted","Data":"433e508aeea7c7a295e283dcdce27fac3b47e3670b950864450fffa1e347c26e"} Oct 05 20:27:18 crc kubenswrapper[4753]: I1005 20:27:18.282922 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" event={"ID":"f2a6113a-f8ed-4eb3-959c-5f07fcefb182","Type":"ContainerStarted","Data":"11f0226616e87665d17dd9f503124d973c3cd645cbf0f9a4945a4e5b723f6144"} Oct 05 20:27:18 crc kubenswrapper[4753]: I1005 20:27:18.303226 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-gvxbk" podStartSLOduration=5.88496309 podStartE2EDuration="14.303211573s" podCreationTimestamp="2025-10-05 20:27:04 +0000 UTC" firstStartedPulling="2025-10-05 20:27:05.019855274 +0000 UTC m=+733.868183506" lastFinishedPulling="2025-10-05 20:27:13.438103757 +0000 UTC m=+742.286431989" observedRunningTime="2025-10-05 20:27:18.301426998 +0000 UTC m=+747.149755270" watchObservedRunningTime="2025-10-05 20:27:18.303211573 +0000 UTC m=+747.151539805" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.290358 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" event={"ID":"f2a6113a-f8ed-4eb3-959c-5f07fcefb182","Type":"ContainerStarted","Data":"fe064fdc0818eb06a60d92fb2b6ce4ea18e7da5a1f19f5d0c9c847a297e33df1"} Oct 05 20:27:19 crc 
kubenswrapper[4753]: I1005 20:27:19.292099 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.294927 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" podUID="2abf5820-8969-4d53-9093-3d1d6beb4438" containerName="route-controller-manager" containerID="cri-o://acd6c760b4dee31c3e90632445824338da9bd1c13846b20fdf25d38d2f3dfd22" gracePeriod=30 Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.295172 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" event={"ID":"2abf5820-8969-4d53-9093-3d1d6beb4438","Type":"ContainerStarted","Data":"acd6c760b4dee31c3e90632445824338da9bd1c13846b20fdf25d38d2f3dfd22"} Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.295214 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.295464 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.302448 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.341938 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7cff895cf6-hg7l4" podStartSLOduration=4.341924026 podStartE2EDuration="4.341924026s" podCreationTimestamp="2025-10-05 20:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:27:19.311927887 +0000 UTC m=+748.160256129" watchObservedRunningTime="2025-10-05 20:27:19.341924026 +0000 UTC m=+748.190252258" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.360372 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" podStartSLOduration=4.360354196 podStartE2EDuration="4.360354196s" podCreationTimestamp="2025-10-05 20:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:27:19.360096768 +0000 UTC m=+748.208425000" watchObservedRunningTime="2025-10-05 20:27:19.360354196 +0000 UTC m=+748.208682428" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.565154 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-v94tn"] Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.566041 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v94tn" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.569504 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.570122 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.588290 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkqjb\" (UniqueName: \"kubernetes.io/projected/a76c319d-5028-4dc2-8383-62571fdcb0c5-kube-api-access-qkqjb\") pod \"openstack-operator-index-v94tn\" (UID: \"a76c319d-5028-4dc2-8383-62571fdcb0c5\") " pod="openstack-operators/openstack-operator-index-v94tn" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.595817 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v94tn"] Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.689374 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkqjb\" (UniqueName: \"kubernetes.io/projected/a76c319d-5028-4dc2-8383-62571fdcb0c5-kube-api-access-qkqjb\") pod \"openstack-operator-index-v94tn\" (UID: \"a76c319d-5028-4dc2-8383-62571fdcb0c5\") " pod="openstack-operators/openstack-operator-index-v94tn" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.711211 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkqjb\" (UniqueName: \"kubernetes.io/projected/a76c319d-5028-4dc2-8383-62571fdcb0c5-kube-api-access-qkqjb\") pod \"openstack-operator-index-v94tn\" (UID: \"a76c319d-5028-4dc2-8383-62571fdcb0c5\") " pod="openstack-operators/openstack-operator-index-v94tn" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.751821 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.790570 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxkbt\" (UniqueName: \"kubernetes.io/projected/2abf5820-8969-4d53-9093-3d1d6beb4438-kube-api-access-fxkbt\") pod \"2abf5820-8969-4d53-9093-3d1d6beb4438\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.790622 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2abf5820-8969-4d53-9093-3d1d6beb4438-serving-cert\") pod \"2abf5820-8969-4d53-9093-3d1d6beb4438\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.790639 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2abf5820-8969-4d53-9093-3d1d6beb4438-client-ca\") pod \"2abf5820-8969-4d53-9093-3d1d6beb4438\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.790704 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abf5820-8969-4d53-9093-3d1d6beb4438-config\") pod \"2abf5820-8969-4d53-9093-3d1d6beb4438\" (UID: \"2abf5820-8969-4d53-9093-3d1d6beb4438\") " Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.791653 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abf5820-8969-4d53-9093-3d1d6beb4438-config" (OuterVolumeSpecName: "config") pod "2abf5820-8969-4d53-9093-3d1d6beb4438" (UID: "2abf5820-8969-4d53-9093-3d1d6beb4438"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.792718 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abf5820-8969-4d53-9093-3d1d6beb4438-client-ca" (OuterVolumeSpecName: "client-ca") pod "2abf5820-8969-4d53-9093-3d1d6beb4438" (UID: "2abf5820-8969-4d53-9093-3d1d6beb4438"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.797530 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2abf5820-8969-4d53-9093-3d1d6beb4438-kube-api-access-fxkbt" (OuterVolumeSpecName: "kube-api-access-fxkbt") pod "2abf5820-8969-4d53-9093-3d1d6beb4438" (UID: "2abf5820-8969-4d53-9093-3d1d6beb4438"). InnerVolumeSpecName "kube-api-access-fxkbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.810460 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2abf5820-8969-4d53-9093-3d1d6beb4438-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2abf5820-8969-4d53-9093-3d1d6beb4438" (UID: "2abf5820-8969-4d53-9093-3d1d6beb4438"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.820662 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x"] Oct 05 20:27:19 crc kubenswrapper[4753]: E1005 20:27:19.820901 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abf5820-8969-4d53-9093-3d1d6beb4438" containerName="route-controller-manager" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.820916 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abf5820-8969-4d53-9093-3d1d6beb4438" containerName="route-controller-manager" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.821026 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abf5820-8969-4d53-9093-3d1d6beb4438" containerName="route-controller-manager" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.821541 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.866082 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x"] Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.889624 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v94tn" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.891531 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-gvxbk" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.891850 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4091ac1a-fcd8-4407-b6c5-59378df4b4f4-config\") pod \"route-controller-manager-5df644bf6-pp48x\" (UID: \"4091ac1a-fcd8-4407-b6c5-59378df4b4f4\") " pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.891891 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4091ac1a-fcd8-4407-b6c5-59378df4b4f4-serving-cert\") pod \"route-controller-manager-5df644bf6-pp48x\" (UID: \"4091ac1a-fcd8-4407-b6c5-59378df4b4f4\") " pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.891931 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmbx2\" (UniqueName: \"kubernetes.io/projected/4091ac1a-fcd8-4407-b6c5-59378df4b4f4-kube-api-access-qmbx2\") pod \"route-controller-manager-5df644bf6-pp48x\" (UID: \"4091ac1a-fcd8-4407-b6c5-59378df4b4f4\") " pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.891947 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4091ac1a-fcd8-4407-b6c5-59378df4b4f4-client-ca\") pod \"route-controller-manager-5df644bf6-pp48x\" (UID: \"4091ac1a-fcd8-4407-b6c5-59378df4b4f4\") " 
pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.892015 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxkbt\" (UniqueName: \"kubernetes.io/projected/2abf5820-8969-4d53-9093-3d1d6beb4438-kube-api-access-fxkbt\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.892028 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2abf5820-8969-4d53-9093-3d1d6beb4438-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.892038 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2abf5820-8969-4d53-9093-3d1d6beb4438-client-ca\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.892046 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abf5820-8969-4d53-9093-3d1d6beb4438-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.992723 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmbx2\" (UniqueName: \"kubernetes.io/projected/4091ac1a-fcd8-4407-b6c5-59378df4b4f4-kube-api-access-qmbx2\") pod \"route-controller-manager-5df644bf6-pp48x\" (UID: \"4091ac1a-fcd8-4407-b6c5-59378df4b4f4\") " pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.992762 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4091ac1a-fcd8-4407-b6c5-59378df4b4f4-client-ca\") pod \"route-controller-manager-5df644bf6-pp48x\" (UID: \"4091ac1a-fcd8-4407-b6c5-59378df4b4f4\") " 
pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.992840 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4091ac1a-fcd8-4407-b6c5-59378df4b4f4-config\") pod \"route-controller-manager-5df644bf6-pp48x\" (UID: \"4091ac1a-fcd8-4407-b6c5-59378df4b4f4\") " pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.992859 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4091ac1a-fcd8-4407-b6c5-59378df4b4f4-serving-cert\") pod \"route-controller-manager-5df644bf6-pp48x\" (UID: \"4091ac1a-fcd8-4407-b6c5-59378df4b4f4\") " pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.994232 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4091ac1a-fcd8-4407-b6c5-59378df4b4f4-client-ca\") pod \"route-controller-manager-5df644bf6-pp48x\" (UID: \"4091ac1a-fcd8-4407-b6c5-59378df4b4f4\") " pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:19 crc kubenswrapper[4753]: I1005 20:27:19.995583 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4091ac1a-fcd8-4407-b6c5-59378df4b4f4-config\") pod \"route-controller-manager-5df644bf6-pp48x\" (UID: \"4091ac1a-fcd8-4407-b6c5-59378df4b4f4\") " pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:20 crc kubenswrapper[4753]: I1005 20:27:20.001725 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4091ac1a-fcd8-4407-b6c5-59378df4b4f4-serving-cert\") pod \"route-controller-manager-5df644bf6-pp48x\" (UID: \"4091ac1a-fcd8-4407-b6c5-59378df4b4f4\") " pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:20 crc kubenswrapper[4753]: I1005 20:27:20.029815 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmbx2\" (UniqueName: \"kubernetes.io/projected/4091ac1a-fcd8-4407-b6c5-59378df4b4f4-kube-api-access-qmbx2\") pod \"route-controller-manager-5df644bf6-pp48x\" (UID: \"4091ac1a-fcd8-4407-b6c5-59378df4b4f4\") " pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:20 crc kubenswrapper[4753]: I1005 20:27:20.153656 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-gvxbk" Oct 05 20:27:20 crc kubenswrapper[4753]: I1005 20:27:20.162271 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:20 crc kubenswrapper[4753]: I1005 20:27:20.322954 4753 generic.go:334] "Generic (PLEG): container finished" podID="2abf5820-8969-4d53-9093-3d1d6beb4438" containerID="acd6c760b4dee31c3e90632445824338da9bd1c13846b20fdf25d38d2f3dfd22" exitCode=0 Oct 05 20:27:20 crc kubenswrapper[4753]: I1005 20:27:20.323680 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" Oct 05 20:27:20 crc kubenswrapper[4753]: I1005 20:27:20.324070 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" event={"ID":"2abf5820-8969-4d53-9093-3d1d6beb4438","Type":"ContainerDied","Data":"acd6c760b4dee31c3e90632445824338da9bd1c13846b20fdf25d38d2f3dfd22"} Oct 05 20:27:20 crc kubenswrapper[4753]: I1005 20:27:20.324102 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n" event={"ID":"2abf5820-8969-4d53-9093-3d1d6beb4438","Type":"ContainerDied","Data":"433e508aeea7c7a295e283dcdce27fac3b47e3670b950864450fffa1e347c26e"} Oct 05 20:27:20 crc kubenswrapper[4753]: I1005 20:27:20.324117 4753 scope.go:117] "RemoveContainer" containerID="acd6c760b4dee31c3e90632445824338da9bd1c13846b20fdf25d38d2f3dfd22" Oct 05 20:27:20 crc kubenswrapper[4753]: I1005 20:27:20.357188 4753 scope.go:117] "RemoveContainer" containerID="acd6c760b4dee31c3e90632445824338da9bd1c13846b20fdf25d38d2f3dfd22" Oct 05 20:27:20 crc kubenswrapper[4753]: E1005 20:27:20.358646 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acd6c760b4dee31c3e90632445824338da9bd1c13846b20fdf25d38d2f3dfd22\": container with ID starting with acd6c760b4dee31c3e90632445824338da9bd1c13846b20fdf25d38d2f3dfd22 not found: ID does not exist" containerID="acd6c760b4dee31c3e90632445824338da9bd1c13846b20fdf25d38d2f3dfd22" Oct 05 20:27:20 crc kubenswrapper[4753]: I1005 20:27:20.358682 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acd6c760b4dee31c3e90632445824338da9bd1c13846b20fdf25d38d2f3dfd22"} err="failed to get container status \"acd6c760b4dee31c3e90632445824338da9bd1c13846b20fdf25d38d2f3dfd22\": rpc error: code = NotFound desc 
= could not find container \"acd6c760b4dee31c3e90632445824338da9bd1c13846b20fdf25d38d2f3dfd22\": container with ID starting with acd6c760b4dee31c3e90632445824338da9bd1c13846b20fdf25d38d2f3dfd22 not found: ID does not exist" Oct 05 20:27:20 crc kubenswrapper[4753]: I1005 20:27:20.375500 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v94tn"] Oct 05 20:27:20 crc kubenswrapper[4753]: I1005 20:27:20.392903 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n"] Oct 05 20:27:20 crc kubenswrapper[4753]: I1005 20:27:20.406439 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74fc58ccc6-2rw9n"] Oct 05 20:27:20 crc kubenswrapper[4753]: I1005 20:27:20.519941 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x"] Oct 05 20:27:21 crc kubenswrapper[4753]: I1005 20:27:21.333381 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v94tn" event={"ID":"a76c319d-5028-4dc2-8383-62571fdcb0c5","Type":"ContainerStarted","Data":"7bf386a46fe9a6138e671aac684fe9f2a0a8334c73bef050fd73f90ab2daf47b"} Oct 05 20:27:21 crc kubenswrapper[4753]: I1005 20:27:21.335176 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" event={"ID":"4091ac1a-fcd8-4407-b6c5-59378df4b4f4","Type":"ContainerStarted","Data":"419a7be9fa86028c7a20b4f6fa192e407eb28854b553f9ce56cde8bbb9da1e37"} Oct 05 20:27:21 crc kubenswrapper[4753]: I1005 20:27:21.335230 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" 
event={"ID":"4091ac1a-fcd8-4407-b6c5-59378df4b4f4","Type":"ContainerStarted","Data":"8908f04529bb03e533d47f02d2e352aba9efaa42683b8067cc6a7b0a815bf095"} Oct 05 20:27:21 crc kubenswrapper[4753]: I1005 20:27:21.335741 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:21 crc kubenswrapper[4753]: I1005 20:27:21.352611 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" podStartSLOduration=4.352595865 podStartE2EDuration="4.352595865s" podCreationTimestamp="2025-10-05 20:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:27:21.352061938 +0000 UTC m=+750.200390170" watchObservedRunningTime="2025-10-05 20:27:21.352595865 +0000 UTC m=+750.200924097" Oct 05 20:27:21 crc kubenswrapper[4753]: I1005 20:27:21.415848 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5df644bf6-pp48x" Oct 05 20:27:21 crc kubenswrapper[4753]: I1005 20:27:21.860630 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2abf5820-8969-4d53-9093-3d1d6beb4438" path="/var/lib/kubelet/pods/2abf5820-8969-4d53-9093-3d1d6beb4438/volumes" Oct 05 20:27:22 crc kubenswrapper[4753]: I1005 20:27:22.341281 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v94tn" event={"ID":"a76c319d-5028-4dc2-8383-62571fdcb0c5","Type":"ContainerStarted","Data":"261ff7e1ab4220ff457b8f0e35d66ea8a33842403fee129011f17856128cf8de"} Oct 05 20:27:22 crc kubenswrapper[4753]: I1005 20:27:22.354701 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-v94tn" podStartSLOduration=2.49460624 
podStartE2EDuration="3.354681173s" podCreationTimestamp="2025-10-05 20:27:19 +0000 UTC" firstStartedPulling="2025-10-05 20:27:20.390406751 +0000 UTC m=+749.238734983" lastFinishedPulling="2025-10-05 20:27:21.250481684 +0000 UTC m=+750.098809916" observedRunningTime="2025-10-05 20:27:22.353184717 +0000 UTC m=+751.201512959" watchObservedRunningTime="2025-10-05 20:27:22.354681173 +0000 UTC m=+751.203009425" Oct 05 20:27:23 crc kubenswrapper[4753]: I1005 20:27:23.733353 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-v94tn"] Oct 05 20:27:24 crc kubenswrapper[4753]: I1005 20:27:24.338023 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jswp7"] Oct 05 20:27:24 crc kubenswrapper[4753]: I1005 20:27:24.338701 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jswp7" Oct 05 20:27:24 crc kubenswrapper[4753]: I1005 20:27:24.341823 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-jgk7g" Oct 05 20:27:24 crc kubenswrapper[4753]: I1005 20:27:24.349561 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt9gc\" (UniqueName: \"kubernetes.io/projected/a7f848da-2bdd-4f76-bc85-94cbb95bd680-kube-api-access-jt9gc\") pod \"openstack-operator-index-jswp7\" (UID: \"a7f848da-2bdd-4f76-bc85-94cbb95bd680\") " pod="openstack-operators/openstack-operator-index-jswp7" Oct 05 20:27:24 crc kubenswrapper[4753]: I1005 20:27:24.351065 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-v94tn" podUID="a76c319d-5028-4dc2-8383-62571fdcb0c5" containerName="registry-server" containerID="cri-o://261ff7e1ab4220ff457b8f0e35d66ea8a33842403fee129011f17856128cf8de" gracePeriod=2 Oct 05 20:27:24 crc kubenswrapper[4753]: 
I1005 20:27:24.352891 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jswp7"] Oct 05 20:27:24 crc kubenswrapper[4753]: I1005 20:27:24.451170 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt9gc\" (UniqueName: \"kubernetes.io/projected/a7f848da-2bdd-4f76-bc85-94cbb95bd680-kube-api-access-jt9gc\") pod \"openstack-operator-index-jswp7\" (UID: \"a7f848da-2bdd-4f76-bc85-94cbb95bd680\") " pod="openstack-operators/openstack-operator-index-jswp7" Oct 05 20:27:24 crc kubenswrapper[4753]: I1005 20:27:24.467938 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt9gc\" (UniqueName: \"kubernetes.io/projected/a7f848da-2bdd-4f76-bc85-94cbb95bd680-kube-api-access-jt9gc\") pod \"openstack-operator-index-jswp7\" (UID: \"a7f848da-2bdd-4f76-bc85-94cbb95bd680\") " pod="openstack-operators/openstack-operator-index-jswp7" Oct 05 20:27:24 crc kubenswrapper[4753]: I1005 20:27:24.601212 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b4sqj" Oct 05 20:27:24 crc kubenswrapper[4753]: I1005 20:27:24.657454 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jswp7" Oct 05 20:27:24 crc kubenswrapper[4753]: I1005 20:27:24.752117 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-bbs9p" Oct 05 20:27:24 crc kubenswrapper[4753]: I1005 20:27:24.797128 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v94tn" Oct 05 20:27:24 crc kubenswrapper[4753]: I1005 20:27:24.962852 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkqjb\" (UniqueName: \"kubernetes.io/projected/a76c319d-5028-4dc2-8383-62571fdcb0c5-kube-api-access-qkqjb\") pod \"a76c319d-5028-4dc2-8383-62571fdcb0c5\" (UID: \"a76c319d-5028-4dc2-8383-62571fdcb0c5\") " Oct 05 20:27:24 crc kubenswrapper[4753]: I1005 20:27:24.968634 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a76c319d-5028-4dc2-8383-62571fdcb0c5-kube-api-access-qkqjb" (OuterVolumeSpecName: "kube-api-access-qkqjb") pod "a76c319d-5028-4dc2-8383-62571fdcb0c5" (UID: "a76c319d-5028-4dc2-8383-62571fdcb0c5"). InnerVolumeSpecName "kube-api-access-qkqjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:27:25 crc kubenswrapper[4753]: I1005 20:27:25.064159 4753 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 05 20:27:25 crc kubenswrapper[4753]: I1005 20:27:25.064768 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkqjb\" (UniqueName: \"kubernetes.io/projected/a76c319d-5028-4dc2-8383-62571fdcb0c5-kube-api-access-qkqjb\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:25 crc kubenswrapper[4753]: I1005 20:27:25.109744 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jswp7"] Oct 05 20:27:25 crc kubenswrapper[4753]: W1005 20:27:25.117492 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7f848da_2bdd_4f76_bc85_94cbb95bd680.slice/crio-2aacbd10dbddb22abc413039958aa79be0bef8ded76c28d0e2bdfbcf7b4d55f3 WatchSource:0}: Error finding container 2aacbd10dbddb22abc413039958aa79be0bef8ded76c28d0e2bdfbcf7b4d55f3: Status 404 returned 
error can't find the container with id 2aacbd10dbddb22abc413039958aa79be0bef8ded76c28d0e2bdfbcf7b4d55f3 Oct 05 20:27:25 crc kubenswrapper[4753]: I1005 20:27:25.358453 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jswp7" event={"ID":"a7f848da-2bdd-4f76-bc85-94cbb95bd680","Type":"ContainerStarted","Data":"2aacbd10dbddb22abc413039958aa79be0bef8ded76c28d0e2bdfbcf7b4d55f3"} Oct 05 20:27:25 crc kubenswrapper[4753]: I1005 20:27:25.360331 4753 generic.go:334] "Generic (PLEG): container finished" podID="a76c319d-5028-4dc2-8383-62571fdcb0c5" containerID="261ff7e1ab4220ff457b8f0e35d66ea8a33842403fee129011f17856128cf8de" exitCode=0 Oct 05 20:27:25 crc kubenswrapper[4753]: I1005 20:27:25.360371 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v94tn" event={"ID":"a76c319d-5028-4dc2-8383-62571fdcb0c5","Type":"ContainerDied","Data":"261ff7e1ab4220ff457b8f0e35d66ea8a33842403fee129011f17856128cf8de"} Oct 05 20:27:25 crc kubenswrapper[4753]: I1005 20:27:25.360392 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v94tn" Oct 05 20:27:25 crc kubenswrapper[4753]: I1005 20:27:25.360690 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v94tn" event={"ID":"a76c319d-5028-4dc2-8383-62571fdcb0c5","Type":"ContainerDied","Data":"7bf386a46fe9a6138e671aac684fe9f2a0a8334c73bef050fd73f90ab2daf47b"} Oct 05 20:27:25 crc kubenswrapper[4753]: I1005 20:27:25.360897 4753 scope.go:117] "RemoveContainer" containerID="261ff7e1ab4220ff457b8f0e35d66ea8a33842403fee129011f17856128cf8de" Oct 05 20:27:25 crc kubenswrapper[4753]: I1005 20:27:25.382093 4753 scope.go:117] "RemoveContainer" containerID="261ff7e1ab4220ff457b8f0e35d66ea8a33842403fee129011f17856128cf8de" Oct 05 20:27:25 crc kubenswrapper[4753]: E1005 20:27:25.382589 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"261ff7e1ab4220ff457b8f0e35d66ea8a33842403fee129011f17856128cf8de\": container with ID starting with 261ff7e1ab4220ff457b8f0e35d66ea8a33842403fee129011f17856128cf8de not found: ID does not exist" containerID="261ff7e1ab4220ff457b8f0e35d66ea8a33842403fee129011f17856128cf8de" Oct 05 20:27:25 crc kubenswrapper[4753]: I1005 20:27:25.382632 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"261ff7e1ab4220ff457b8f0e35d66ea8a33842403fee129011f17856128cf8de"} err="failed to get container status \"261ff7e1ab4220ff457b8f0e35d66ea8a33842403fee129011f17856128cf8de\": rpc error: code = NotFound desc = could not find container \"261ff7e1ab4220ff457b8f0e35d66ea8a33842403fee129011f17856128cf8de\": container with ID starting with 261ff7e1ab4220ff457b8f0e35d66ea8a33842403fee129011f17856128cf8de not found: ID does not exist" Oct 05 20:27:25 crc kubenswrapper[4753]: I1005 20:27:25.394095 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-v94tn"] 
Oct 05 20:27:25 crc kubenswrapper[4753]: I1005 20:27:25.402416 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-v94tn"] Oct 05 20:27:25 crc kubenswrapper[4753]: I1005 20:27:25.858782 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a76c319d-5028-4dc2-8383-62571fdcb0c5" path="/var/lib/kubelet/pods/a76c319d-5028-4dc2-8383-62571fdcb0c5/volumes" Oct 05 20:27:26 crc kubenswrapper[4753]: I1005 20:27:26.367592 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jswp7" event={"ID":"a7f848da-2bdd-4f76-bc85-94cbb95bd680","Type":"ContainerStarted","Data":"1a6e089aa180e152f39fa2d5f29f401b27f145dc39509708327e2a7dec750be0"} Oct 05 20:27:26 crc kubenswrapper[4753]: I1005 20:27:26.385208 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jswp7" podStartSLOduration=1.923914187 podStartE2EDuration="2.385193165s" podCreationTimestamp="2025-10-05 20:27:24 +0000 UTC" firstStartedPulling="2025-10-05 20:27:25.121513319 +0000 UTC m=+753.969841551" lastFinishedPulling="2025-10-05 20:27:25.582792297 +0000 UTC m=+754.431120529" observedRunningTime="2025-10-05 20:27:26.384790632 +0000 UTC m=+755.233118854" watchObservedRunningTime="2025-10-05 20:27:26.385193165 +0000 UTC m=+755.233521397" Oct 05 20:27:31 crc kubenswrapper[4753]: I1005 20:27:31.542828 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nsxbf"] Oct 05 20:27:31 crc kubenswrapper[4753]: E1005 20:27:31.544210 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76c319d-5028-4dc2-8383-62571fdcb0c5" containerName="registry-server" Oct 05 20:27:31 crc kubenswrapper[4753]: I1005 20:27:31.544235 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76c319d-5028-4dc2-8383-62571fdcb0c5" containerName="registry-server" Oct 05 20:27:31 crc kubenswrapper[4753]: I1005 
20:27:31.544416 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a76c319d-5028-4dc2-8383-62571fdcb0c5" containerName="registry-server" Oct 05 20:27:31 crc kubenswrapper[4753]: I1005 20:27:31.545546 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:31 crc kubenswrapper[4753]: I1005 20:27:31.598657 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nsxbf"] Oct 05 20:27:31 crc kubenswrapper[4753]: I1005 20:27:31.648475 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41700734-fc8e-4943-9e12-1cf73b727a65-catalog-content\") pod \"redhat-marketplace-nsxbf\" (UID: \"41700734-fc8e-4943-9e12-1cf73b727a65\") " pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:31 crc kubenswrapper[4753]: I1005 20:27:31.648516 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41700734-fc8e-4943-9e12-1cf73b727a65-utilities\") pod \"redhat-marketplace-nsxbf\" (UID: \"41700734-fc8e-4943-9e12-1cf73b727a65\") " pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:31 crc kubenswrapper[4753]: I1005 20:27:31.648548 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9h6z\" (UniqueName: \"kubernetes.io/projected/41700734-fc8e-4943-9e12-1cf73b727a65-kube-api-access-w9h6z\") pod \"redhat-marketplace-nsxbf\" (UID: \"41700734-fc8e-4943-9e12-1cf73b727a65\") " pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:31 crc kubenswrapper[4753]: I1005 20:27:31.749352 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9h6z\" (UniqueName: 
\"kubernetes.io/projected/41700734-fc8e-4943-9e12-1cf73b727a65-kube-api-access-w9h6z\") pod \"redhat-marketplace-nsxbf\" (UID: \"41700734-fc8e-4943-9e12-1cf73b727a65\") " pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:31 crc kubenswrapper[4753]: I1005 20:27:31.749457 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41700734-fc8e-4943-9e12-1cf73b727a65-catalog-content\") pod \"redhat-marketplace-nsxbf\" (UID: \"41700734-fc8e-4943-9e12-1cf73b727a65\") " pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:31 crc kubenswrapper[4753]: I1005 20:27:31.749473 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41700734-fc8e-4943-9e12-1cf73b727a65-utilities\") pod \"redhat-marketplace-nsxbf\" (UID: \"41700734-fc8e-4943-9e12-1cf73b727a65\") " pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:31 crc kubenswrapper[4753]: I1005 20:27:31.749938 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41700734-fc8e-4943-9e12-1cf73b727a65-utilities\") pod \"redhat-marketplace-nsxbf\" (UID: \"41700734-fc8e-4943-9e12-1cf73b727a65\") " pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:31 crc kubenswrapper[4753]: I1005 20:27:31.750177 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41700734-fc8e-4943-9e12-1cf73b727a65-catalog-content\") pod \"redhat-marketplace-nsxbf\" (UID: \"41700734-fc8e-4943-9e12-1cf73b727a65\") " pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:31 crc kubenswrapper[4753]: I1005 20:27:31.770483 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9h6z\" (UniqueName: 
\"kubernetes.io/projected/41700734-fc8e-4943-9e12-1cf73b727a65-kube-api-access-w9h6z\") pod \"redhat-marketplace-nsxbf\" (UID: \"41700734-fc8e-4943-9e12-1cf73b727a65\") " pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:31 crc kubenswrapper[4753]: I1005 20:27:31.867681 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:32 crc kubenswrapper[4753]: I1005 20:27:32.286314 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nsxbf"] Oct 05 20:27:32 crc kubenswrapper[4753]: W1005 20:27:32.296481 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41700734_fc8e_4943_9e12_1cf73b727a65.slice/crio-1e87ed120c299e68506044f7435de0ea4ca3450cf96cad615d374e54cbde8b10 WatchSource:0}: Error finding container 1e87ed120c299e68506044f7435de0ea4ca3450cf96cad615d374e54cbde8b10: Status 404 returned error can't find the container with id 1e87ed120c299e68506044f7435de0ea4ca3450cf96cad615d374e54cbde8b10 Oct 05 20:27:32 crc kubenswrapper[4753]: I1005 20:27:32.417560 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nsxbf" event={"ID":"41700734-fc8e-4943-9e12-1cf73b727a65","Type":"ContainerStarted","Data":"1e87ed120c299e68506044f7435de0ea4ca3450cf96cad615d374e54cbde8b10"} Oct 05 20:27:33 crc kubenswrapper[4753]: I1005 20:27:33.425332 4753 generic.go:334] "Generic (PLEG): container finished" podID="41700734-fc8e-4943-9e12-1cf73b727a65" containerID="9d4230bef2e4231f662c891f52e1b5bf990c51b51b1df8f7f629a2d1972f96b9" exitCode=0 Oct 05 20:27:33 crc kubenswrapper[4753]: I1005 20:27:33.425512 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nsxbf" 
event={"ID":"41700734-fc8e-4943-9e12-1cf73b727a65","Type":"ContainerDied","Data":"9d4230bef2e4231f662c891f52e1b5bf990c51b51b1df8f7f629a2d1972f96b9"} Oct 05 20:27:34 crc kubenswrapper[4753]: I1005 20:27:34.435878 4753 generic.go:334] "Generic (PLEG): container finished" podID="41700734-fc8e-4943-9e12-1cf73b727a65" containerID="ec5762c74cb4ec114aae95dbb7c1aee0e42e9df53ec38dde60ec67719ad53486" exitCode=0 Oct 05 20:27:34 crc kubenswrapper[4753]: I1005 20:27:34.435981 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nsxbf" event={"ID":"41700734-fc8e-4943-9e12-1cf73b727a65","Type":"ContainerDied","Data":"ec5762c74cb4ec114aae95dbb7c1aee0e42e9df53ec38dde60ec67719ad53486"} Oct 05 20:27:34 crc kubenswrapper[4753]: I1005 20:27:34.490079 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:27:34 crc kubenswrapper[4753]: I1005 20:27:34.490130 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:27:34 crc kubenswrapper[4753]: I1005 20:27:34.658854 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-jswp7" Oct 05 20:27:34 crc kubenswrapper[4753]: I1005 20:27:34.659214 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-jswp7" Oct 05 20:27:34 crc kubenswrapper[4753]: I1005 20:27:34.693863 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/openstack-operator-index-jswp7" Oct 05 20:27:34 crc kubenswrapper[4753]: I1005 20:27:34.894047 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-gvxbk" Oct 05 20:27:35 crc kubenswrapper[4753]: I1005 20:27:35.446204 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nsxbf" event={"ID":"41700734-fc8e-4943-9e12-1cf73b727a65","Type":"ContainerStarted","Data":"afcfd3618009d41f435657295650ad5f78ae5df4a75a2bab2bb0a20185e13eee"} Oct 05 20:27:35 crc kubenswrapper[4753]: I1005 20:27:35.462251 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nsxbf" podStartSLOduration=3.017938331 podStartE2EDuration="4.462236638s" podCreationTimestamp="2025-10-05 20:27:31 +0000 UTC" firstStartedPulling="2025-10-05 20:27:33.427187445 +0000 UTC m=+762.275515687" lastFinishedPulling="2025-10-05 20:27:34.871485722 +0000 UTC m=+763.719813994" observedRunningTime="2025-10-05 20:27:35.461880508 +0000 UTC m=+764.310208740" watchObservedRunningTime="2025-10-05 20:27:35.462236638 +0000 UTC m=+764.310564870" Oct 05 20:27:35 crc kubenswrapper[4753]: I1005 20:27:35.488025 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-jswp7" Oct 05 20:27:41 crc kubenswrapper[4753]: I1005 20:27:41.810079 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574"] Oct 05 20:27:41 crc kubenswrapper[4753]: I1005 20:27:41.816807 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" Oct 05 20:27:41 crc kubenswrapper[4753]: I1005 20:27:41.820947 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574"] Oct 05 20:27:41 crc kubenswrapper[4753]: I1005 20:27:41.823389 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-m6jj7" Oct 05 20:27:41 crc kubenswrapper[4753]: I1005 20:27:41.868029 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:41 crc kubenswrapper[4753]: I1005 20:27:41.868083 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:41 crc kubenswrapper[4753]: I1005 20:27:41.887628 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-bundle\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574\" (UID: \"95cb6a2f-3fb4-43d6-b342-c17059bf0e73\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" Oct 05 20:27:41 crc kubenswrapper[4753]: I1005 20:27:41.887888 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmb6q\" (UniqueName: \"kubernetes.io/projected/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-kube-api-access-rmb6q\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574\" (UID: \"95cb6a2f-3fb4-43d6-b342-c17059bf0e73\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" Oct 05 20:27:41 crc kubenswrapper[4753]: I1005 20:27:41.888045 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-util\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574\" (UID: \"95cb6a2f-3fb4-43d6-b342-c17059bf0e73\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" Oct 05 20:27:41 crc kubenswrapper[4753]: I1005 20:27:41.923004 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:41 crc kubenswrapper[4753]: I1005 20:27:41.989352 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-bundle\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574\" (UID: \"95cb6a2f-3fb4-43d6-b342-c17059bf0e73\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" Oct 05 20:27:41 crc kubenswrapper[4753]: I1005 20:27:41.989466 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmb6q\" (UniqueName: \"kubernetes.io/projected/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-kube-api-access-rmb6q\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574\" (UID: \"95cb6a2f-3fb4-43d6-b342-c17059bf0e73\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" Oct 05 20:27:41 crc kubenswrapper[4753]: I1005 20:27:41.989538 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-util\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574\" (UID: \"95cb6a2f-3fb4-43d6-b342-c17059bf0e73\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" Oct 05 20:27:41 crc kubenswrapper[4753]: I1005 20:27:41.990263 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-bundle\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574\" (UID: \"95cb6a2f-3fb4-43d6-b342-c17059bf0e73\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" Oct 05 20:27:41 crc kubenswrapper[4753]: I1005 20:27:41.991076 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-util\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574\" (UID: \"95cb6a2f-3fb4-43d6-b342-c17059bf0e73\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" Oct 05 20:27:42 crc kubenswrapper[4753]: I1005 20:27:42.015352 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmb6q\" (UniqueName: \"kubernetes.io/projected/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-kube-api-access-rmb6q\") pod \"0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574\" (UID: \"95cb6a2f-3fb4-43d6-b342-c17059bf0e73\") " pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" Oct 05 20:27:42 crc kubenswrapper[4753]: I1005 20:27:42.144808 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" Oct 05 20:27:42 crc kubenswrapper[4753]: I1005 20:27:42.558406 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:42 crc kubenswrapper[4753]: I1005 20:27:42.590298 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574"] Oct 05 20:27:43 crc kubenswrapper[4753]: I1005 20:27:43.512451 4753 generic.go:334] "Generic (PLEG): container finished" podID="95cb6a2f-3fb4-43d6-b342-c17059bf0e73" containerID="07f72c68d9cd256b75d4b0fc680992b646ee62576f1281834544d3897278a7d4" exitCode=0 Oct 05 20:27:43 crc kubenswrapper[4753]: I1005 20:27:43.512545 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" event={"ID":"95cb6a2f-3fb4-43d6-b342-c17059bf0e73","Type":"ContainerDied","Data":"07f72c68d9cd256b75d4b0fc680992b646ee62576f1281834544d3897278a7d4"} Oct 05 20:27:43 crc kubenswrapper[4753]: I1005 20:27:43.512576 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" event={"ID":"95cb6a2f-3fb4-43d6-b342-c17059bf0e73","Type":"ContainerStarted","Data":"08582e66172c5183372f478bb24256dc61aadbbce3b630e4a5f651a3f93e9c7b"} Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.134216 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nsxbf"] Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.134419 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nsxbf" podUID="41700734-fc8e-4943-9e12-1cf73b727a65" containerName="registry-server" 
containerID="cri-o://afcfd3618009d41f435657295650ad5f78ae5df4a75a2bab2bb0a20185e13eee" gracePeriod=2 Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.530494 4753 generic.go:334] "Generic (PLEG): container finished" podID="41700734-fc8e-4943-9e12-1cf73b727a65" containerID="afcfd3618009d41f435657295650ad5f78ae5df4a75a2bab2bb0a20185e13eee" exitCode=0 Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.530544 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nsxbf" event={"ID":"41700734-fc8e-4943-9e12-1cf73b727a65","Type":"ContainerDied","Data":"afcfd3618009d41f435657295650ad5f78ae5df4a75a2bab2bb0a20185e13eee"} Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.531782 4753 generic.go:334] "Generic (PLEG): container finished" podID="95cb6a2f-3fb4-43d6-b342-c17059bf0e73" containerID="61efbc6acab0214a8d31f4077eb25d8ec6325b144ad567b81cfc768c0eb96cdb" exitCode=0 Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.531808 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" event={"ID":"95cb6a2f-3fb4-43d6-b342-c17059bf0e73","Type":"ContainerDied","Data":"61efbc6acab0214a8d31f4077eb25d8ec6325b144ad567b81cfc768c0eb96cdb"} Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.636093 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.737089 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9h6z\" (UniqueName: \"kubernetes.io/projected/41700734-fc8e-4943-9e12-1cf73b727a65-kube-api-access-w9h6z\") pod \"41700734-fc8e-4943-9e12-1cf73b727a65\" (UID: \"41700734-fc8e-4943-9e12-1cf73b727a65\") " Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.737529 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41700734-fc8e-4943-9e12-1cf73b727a65-utilities\") pod \"41700734-fc8e-4943-9e12-1cf73b727a65\" (UID: \"41700734-fc8e-4943-9e12-1cf73b727a65\") " Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.737626 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41700734-fc8e-4943-9e12-1cf73b727a65-catalog-content\") pod \"41700734-fc8e-4943-9e12-1cf73b727a65\" (UID: \"41700734-fc8e-4943-9e12-1cf73b727a65\") " Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.739001 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41700734-fc8e-4943-9e12-1cf73b727a65-utilities" (OuterVolumeSpecName: "utilities") pod "41700734-fc8e-4943-9e12-1cf73b727a65" (UID: "41700734-fc8e-4943-9e12-1cf73b727a65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.752120 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41700734-fc8e-4943-9e12-1cf73b727a65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41700734-fc8e-4943-9e12-1cf73b727a65" (UID: "41700734-fc8e-4943-9e12-1cf73b727a65"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.753336 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41700734-fc8e-4943-9e12-1cf73b727a65-kube-api-access-w9h6z" (OuterVolumeSpecName: "kube-api-access-w9h6z") pod "41700734-fc8e-4943-9e12-1cf73b727a65" (UID: "41700734-fc8e-4943-9e12-1cf73b727a65"). InnerVolumeSpecName "kube-api-access-w9h6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.839092 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41700734-fc8e-4943-9e12-1cf73b727a65-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.839132 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41700734-fc8e-4943-9e12-1cf73b727a65-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:45 crc kubenswrapper[4753]: I1005 20:27:45.839156 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9h6z\" (UniqueName: \"kubernetes.io/projected/41700734-fc8e-4943-9e12-1cf73b727a65-kube-api-access-w9h6z\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:46 crc kubenswrapper[4753]: I1005 20:27:46.539983 4753 generic.go:334] "Generic (PLEG): container finished" podID="95cb6a2f-3fb4-43d6-b342-c17059bf0e73" containerID="cb77a6989d64865172c6abb686cf73ff977b5f304672dfc9fe11731f92ffee5f" exitCode=0 Oct 05 20:27:46 crc kubenswrapper[4753]: I1005 20:27:46.540439 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" event={"ID":"95cb6a2f-3fb4-43d6-b342-c17059bf0e73","Type":"ContainerDied","Data":"cb77a6989d64865172c6abb686cf73ff977b5f304672dfc9fe11731f92ffee5f"} Oct 05 20:27:46 crc kubenswrapper[4753]: I1005 
20:27:46.542781 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nsxbf" event={"ID":"41700734-fc8e-4943-9e12-1cf73b727a65","Type":"ContainerDied","Data":"1e87ed120c299e68506044f7435de0ea4ca3450cf96cad615d374e54cbde8b10"} Oct 05 20:27:46 crc kubenswrapper[4753]: I1005 20:27:46.542885 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nsxbf" Oct 05 20:27:46 crc kubenswrapper[4753]: I1005 20:27:46.542904 4753 scope.go:117] "RemoveContainer" containerID="afcfd3618009d41f435657295650ad5f78ae5df4a75a2bab2bb0a20185e13eee" Oct 05 20:27:46 crc kubenswrapper[4753]: I1005 20:27:46.558793 4753 scope.go:117] "RemoveContainer" containerID="ec5762c74cb4ec114aae95dbb7c1aee0e42e9df53ec38dde60ec67719ad53486" Oct 05 20:27:46 crc kubenswrapper[4753]: I1005 20:27:46.579116 4753 scope.go:117] "RemoveContainer" containerID="9d4230bef2e4231f662c891f52e1b5bf990c51b51b1df8f7f629a2d1972f96b9" Oct 05 20:27:46 crc kubenswrapper[4753]: I1005 20:27:46.585248 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nsxbf"] Oct 05 20:27:46 crc kubenswrapper[4753]: I1005 20:27:46.588778 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nsxbf"] Oct 05 20:27:47 crc kubenswrapper[4753]: I1005 20:27:47.800288 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" Oct 05 20:27:47 crc kubenswrapper[4753]: I1005 20:27:47.859428 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41700734-fc8e-4943-9e12-1cf73b727a65" path="/var/lib/kubelet/pods/41700734-fc8e-4943-9e12-1cf73b727a65/volumes" Oct 05 20:27:47 crc kubenswrapper[4753]: I1005 20:27:47.969762 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-bundle\") pod \"95cb6a2f-3fb4-43d6-b342-c17059bf0e73\" (UID: \"95cb6a2f-3fb4-43d6-b342-c17059bf0e73\") " Oct 05 20:27:47 crc kubenswrapper[4753]: I1005 20:27:47.969808 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-util\") pod \"95cb6a2f-3fb4-43d6-b342-c17059bf0e73\" (UID: \"95cb6a2f-3fb4-43d6-b342-c17059bf0e73\") " Oct 05 20:27:47 crc kubenswrapper[4753]: I1005 20:27:47.969893 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmb6q\" (UniqueName: \"kubernetes.io/projected/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-kube-api-access-rmb6q\") pod \"95cb6a2f-3fb4-43d6-b342-c17059bf0e73\" (UID: \"95cb6a2f-3fb4-43d6-b342-c17059bf0e73\") " Oct 05 20:27:47 crc kubenswrapper[4753]: I1005 20:27:47.970482 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-bundle" (OuterVolumeSpecName: "bundle") pod "95cb6a2f-3fb4-43d6-b342-c17059bf0e73" (UID: "95cb6a2f-3fb4-43d6-b342-c17059bf0e73"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:27:47 crc kubenswrapper[4753]: I1005 20:27:47.975288 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-kube-api-access-rmb6q" (OuterVolumeSpecName: "kube-api-access-rmb6q") pod "95cb6a2f-3fb4-43d6-b342-c17059bf0e73" (UID: "95cb6a2f-3fb4-43d6-b342-c17059bf0e73"). InnerVolumeSpecName "kube-api-access-rmb6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.071678 4753 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.071707 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmb6q\" (UniqueName: \"kubernetes.io/projected/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-kube-api-access-rmb6q\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.271028 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-util" (OuterVolumeSpecName: "util") pod "95cb6a2f-3fb4-43d6-b342-c17059bf0e73" (UID: "95cb6a2f-3fb4-43d6-b342-c17059bf0e73"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.273979 4753 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95cb6a2f-3fb4-43d6-b342-c17059bf0e73-util\") on node \"crc\" DevicePath \"\"" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.560204 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" event={"ID":"95cb6a2f-3fb4-43d6-b342-c17059bf0e73","Type":"ContainerDied","Data":"08582e66172c5183372f478bb24256dc61aadbbce3b630e4a5f651a3f93e9c7b"} Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.560259 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08582e66172c5183372f478bb24256dc61aadbbce3b630e4a5f651a3f93e9c7b" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.560345 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.743847 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8rxdq"] Oct 05 20:27:48 crc kubenswrapper[4753]: E1005 20:27:48.744101 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41700734-fc8e-4943-9e12-1cf73b727a65" containerName="extract-utilities" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.744118 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="41700734-fc8e-4943-9e12-1cf73b727a65" containerName="extract-utilities" Oct 05 20:27:48 crc kubenswrapper[4753]: E1005 20:27:48.744133 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95cb6a2f-3fb4-43d6-b342-c17059bf0e73" containerName="extract" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.744158 4753 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="95cb6a2f-3fb4-43d6-b342-c17059bf0e73" containerName="extract" Oct 05 20:27:48 crc kubenswrapper[4753]: E1005 20:27:48.744170 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41700734-fc8e-4943-9e12-1cf73b727a65" containerName="extract-content" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.744178 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="41700734-fc8e-4943-9e12-1cf73b727a65" containerName="extract-content" Oct 05 20:27:48 crc kubenswrapper[4753]: E1005 20:27:48.744194 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41700734-fc8e-4943-9e12-1cf73b727a65" containerName="registry-server" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.744203 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="41700734-fc8e-4943-9e12-1cf73b727a65" containerName="registry-server" Oct 05 20:27:48 crc kubenswrapper[4753]: E1005 20:27:48.744222 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95cb6a2f-3fb4-43d6-b342-c17059bf0e73" containerName="util" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.744232 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="95cb6a2f-3fb4-43d6-b342-c17059bf0e73" containerName="util" Oct 05 20:27:48 crc kubenswrapper[4753]: E1005 20:27:48.744242 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95cb6a2f-3fb4-43d6-b342-c17059bf0e73" containerName="pull" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.744249 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="95cb6a2f-3fb4-43d6-b342-c17059bf0e73" containerName="pull" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.744375 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="95cb6a2f-3fb4-43d6-b342-c17059bf0e73" containerName="extract" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.744388 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="41700734-fc8e-4943-9e12-1cf73b727a65" 
containerName="registry-server" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.745092 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.756500 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rxdq"] Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.881047 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/564930ae-a228-4e79-9378-7c307b70a4fd-catalog-content\") pod \"certified-operators-8rxdq\" (UID: \"564930ae-a228-4e79-9378-7c307b70a4fd\") " pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.881202 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/564930ae-a228-4e79-9378-7c307b70a4fd-utilities\") pod \"certified-operators-8rxdq\" (UID: \"564930ae-a228-4e79-9378-7c307b70a4fd\") " pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.881282 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctxkq\" (UniqueName: \"kubernetes.io/projected/564930ae-a228-4e79-9378-7c307b70a4fd-kube-api-access-ctxkq\") pod \"certified-operators-8rxdq\" (UID: \"564930ae-a228-4e79-9378-7c307b70a4fd\") " pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.982073 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/564930ae-a228-4e79-9378-7c307b70a4fd-catalog-content\") pod \"certified-operators-8rxdq\" (UID: \"564930ae-a228-4e79-9378-7c307b70a4fd\") " 
pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.982202 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/564930ae-a228-4e79-9378-7c307b70a4fd-utilities\") pod \"certified-operators-8rxdq\" (UID: \"564930ae-a228-4e79-9378-7c307b70a4fd\") " pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.982241 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctxkq\" (UniqueName: \"kubernetes.io/projected/564930ae-a228-4e79-9378-7c307b70a4fd-kube-api-access-ctxkq\") pod \"certified-operators-8rxdq\" (UID: \"564930ae-a228-4e79-9378-7c307b70a4fd\") " pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.982685 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/564930ae-a228-4e79-9378-7c307b70a4fd-catalog-content\") pod \"certified-operators-8rxdq\" (UID: \"564930ae-a228-4e79-9378-7c307b70a4fd\") " pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:27:48 crc kubenswrapper[4753]: I1005 20:27:48.982878 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/564930ae-a228-4e79-9378-7c307b70a4fd-utilities\") pod \"certified-operators-8rxdq\" (UID: \"564930ae-a228-4e79-9378-7c307b70a4fd\") " pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:27:49 crc kubenswrapper[4753]: I1005 20:27:49.006128 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctxkq\" (UniqueName: \"kubernetes.io/projected/564930ae-a228-4e79-9378-7c307b70a4fd-kube-api-access-ctxkq\") pod \"certified-operators-8rxdq\" (UID: \"564930ae-a228-4e79-9378-7c307b70a4fd\") " 
pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:27:49 crc kubenswrapper[4753]: I1005 20:27:49.063804 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:27:49 crc kubenswrapper[4753]: I1005 20:27:49.416500 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rxdq"] Oct 05 20:27:49 crc kubenswrapper[4753]: W1005 20:27:49.434264 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod564930ae_a228_4e79_9378_7c307b70a4fd.slice/crio-4a57ef1912ad5293260b2d558c21dbe9882b407a812941154a5d4337ab17bb56 WatchSource:0}: Error finding container 4a57ef1912ad5293260b2d558c21dbe9882b407a812941154a5d4337ab17bb56: Status 404 returned error can't find the container with id 4a57ef1912ad5293260b2d558c21dbe9882b407a812941154a5d4337ab17bb56 Oct 05 20:27:49 crc kubenswrapper[4753]: I1005 20:27:49.568733 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxdq" event={"ID":"564930ae-a228-4e79-9378-7c307b70a4fd","Type":"ContainerStarted","Data":"4a57ef1912ad5293260b2d558c21dbe9882b407a812941154a5d4337ab17bb56"} Oct 05 20:27:50 crc kubenswrapper[4753]: I1005 20:27:50.102733 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-677d5bb784-5l45z"] Oct 05 20:27:50 crc kubenswrapper[4753]: I1005 20:27:50.103620 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-5l45z" Oct 05 20:27:50 crc kubenswrapper[4753]: I1005 20:27:50.109401 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-75kfh" Oct 05 20:27:50 crc kubenswrapper[4753]: I1005 20:27:50.128368 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-677d5bb784-5l45z"] Oct 05 20:27:50 crc kubenswrapper[4753]: I1005 20:27:50.195699 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4888j\" (UniqueName: \"kubernetes.io/projected/22ffe795-4cc5-4c86-9ae6-04999586c7de-kube-api-access-4888j\") pod \"openstack-operator-controller-operator-677d5bb784-5l45z\" (UID: \"22ffe795-4cc5-4c86-9ae6-04999586c7de\") " pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-5l45z" Oct 05 20:27:50 crc kubenswrapper[4753]: I1005 20:27:50.297591 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4888j\" (UniqueName: \"kubernetes.io/projected/22ffe795-4cc5-4c86-9ae6-04999586c7de-kube-api-access-4888j\") pod \"openstack-operator-controller-operator-677d5bb784-5l45z\" (UID: \"22ffe795-4cc5-4c86-9ae6-04999586c7de\") " pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-5l45z" Oct 05 20:27:50 crc kubenswrapper[4753]: I1005 20:27:50.323180 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4888j\" (UniqueName: \"kubernetes.io/projected/22ffe795-4cc5-4c86-9ae6-04999586c7de-kube-api-access-4888j\") pod \"openstack-operator-controller-operator-677d5bb784-5l45z\" (UID: \"22ffe795-4cc5-4c86-9ae6-04999586c7de\") " pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-5l45z" Oct 05 20:27:50 crc kubenswrapper[4753]: I1005 20:27:50.419851 4753 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-5l45z" Oct 05 20:27:50 crc kubenswrapper[4753]: I1005 20:27:50.579891 4753 generic.go:334] "Generic (PLEG): container finished" podID="564930ae-a228-4e79-9378-7c307b70a4fd" containerID="316a2819408df0ec54a7b98b491ba6800294d08ebdcb924bcd9d31f63babb3f0" exitCode=0 Oct 05 20:27:50 crc kubenswrapper[4753]: I1005 20:27:50.579933 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxdq" event={"ID":"564930ae-a228-4e79-9378-7c307b70a4fd","Type":"ContainerDied","Data":"316a2819408df0ec54a7b98b491ba6800294d08ebdcb924bcd9d31f63babb3f0"} Oct 05 20:27:50 crc kubenswrapper[4753]: I1005 20:27:50.872368 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-677d5bb784-5l45z"] Oct 05 20:27:50 crc kubenswrapper[4753]: W1005 20:27:50.880040 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22ffe795_4cc5_4c86_9ae6_04999586c7de.slice/crio-5524441b4e9df126f0afc6037c5484f979e0c418cc2f1fd9da74ee7746685bb0 WatchSource:0}: Error finding container 5524441b4e9df126f0afc6037c5484f979e0c418cc2f1fd9da74ee7746685bb0: Status 404 returned error can't find the container with id 5524441b4e9df126f0afc6037c5484f979e0c418cc2f1fd9da74ee7746685bb0 Oct 05 20:27:51 crc kubenswrapper[4753]: I1005 20:27:51.588221 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxdq" event={"ID":"564930ae-a228-4e79-9378-7c307b70a4fd","Type":"ContainerStarted","Data":"0a0de94783c58a56fc0bdd9befa5eca24a47792ab10b98814ff795299c43fbe9"} Oct 05 20:27:51 crc kubenswrapper[4753]: I1005 20:27:51.592170 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-5l45z" 
event={"ID":"22ffe795-4cc5-4c86-9ae6-04999586c7de","Type":"ContainerStarted","Data":"5524441b4e9df126f0afc6037c5484f979e0c418cc2f1fd9da74ee7746685bb0"} Oct 05 20:27:52 crc kubenswrapper[4753]: I1005 20:27:52.601397 4753 generic.go:334] "Generic (PLEG): container finished" podID="564930ae-a228-4e79-9378-7c307b70a4fd" containerID="0a0de94783c58a56fc0bdd9befa5eca24a47792ab10b98814ff795299c43fbe9" exitCode=0 Oct 05 20:27:52 crc kubenswrapper[4753]: I1005 20:27:52.601723 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxdq" event={"ID":"564930ae-a228-4e79-9378-7c307b70a4fd","Type":"ContainerDied","Data":"0a0de94783c58a56fc0bdd9befa5eca24a47792ab10b98814ff795299c43fbe9"} Oct 05 20:27:56 crc kubenswrapper[4753]: I1005 20:27:56.628451 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxdq" event={"ID":"564930ae-a228-4e79-9378-7c307b70a4fd","Type":"ContainerStarted","Data":"0920951cabc037b9c7ecd9f3c83cb72a5afec8f4809afa320bfcf0f766458144"} Oct 05 20:27:56 crc kubenswrapper[4753]: I1005 20:27:56.639315 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-5l45z" event={"ID":"22ffe795-4cc5-4c86-9ae6-04999586c7de","Type":"ContainerStarted","Data":"2e2c3ef64ba227e5ca2cb915760b7e41bfafaa2534ee9e9be6894702e5460fd3"} Oct 05 20:27:58 crc kubenswrapper[4753]: I1005 20:27:58.653720 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-5l45z" event={"ID":"22ffe795-4cc5-4c86-9ae6-04999586c7de","Type":"ContainerStarted","Data":"2ce1409edd5651e0000e892f6b7b27af259b391509c222360419ef85457c3c84"} Oct 05 20:27:58 crc kubenswrapper[4753]: I1005 20:27:58.654911 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-5l45z" Oct 05 20:27:58 crc 
kubenswrapper[4753]: I1005 20:27:58.702516 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8rxdq" podStartSLOduration=5.725324951 podStartE2EDuration="10.70249676s" podCreationTimestamp="2025-10-05 20:27:48 +0000 UTC" firstStartedPulling="2025-10-05 20:27:50.58671498 +0000 UTC m=+779.435043212" lastFinishedPulling="2025-10-05 20:27:55.563886789 +0000 UTC m=+784.412215021" observedRunningTime="2025-10-05 20:27:56.651988272 +0000 UTC m=+785.500316514" watchObservedRunningTime="2025-10-05 20:27:58.70249676 +0000 UTC m=+787.550825002" Oct 05 20:27:58 crc kubenswrapper[4753]: I1005 20:27:58.705664 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-5l45z" podStartSLOduration=1.52022822 podStartE2EDuration="8.705651718s" podCreationTimestamp="2025-10-05 20:27:50 +0000 UTC" firstStartedPulling="2025-10-05 20:27:50.882352755 +0000 UTC m=+779.730680987" lastFinishedPulling="2025-10-05 20:27:58.067776243 +0000 UTC m=+786.916104485" observedRunningTime="2025-10-05 20:27:58.700581191 +0000 UTC m=+787.548909473" watchObservedRunningTime="2025-10-05 20:27:58.705651718 +0000 UTC m=+787.553979970" Oct 05 20:27:59 crc kubenswrapper[4753]: I1005 20:27:59.064748 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:27:59 crc kubenswrapper[4753]: I1005 20:27:59.065002 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:27:59 crc kubenswrapper[4753]: I1005 20:27:59.128265 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:28:00 crc kubenswrapper[4753]: I1005 20:28:00.421518 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-operator-677d5bb784-5l45z" Oct 05 20:28:00 crc kubenswrapper[4753]: I1005 20:28:00.719929 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:28:01 crc kubenswrapper[4753]: I1005 20:28:01.338529 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rxdq"] Oct 05 20:28:02 crc kubenswrapper[4753]: I1005 20:28:02.676396 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8rxdq" podUID="564930ae-a228-4e79-9378-7c307b70a4fd" containerName="registry-server" containerID="cri-o://0920951cabc037b9c7ecd9f3c83cb72a5afec8f4809afa320bfcf0f766458144" gracePeriod=2 Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.008998 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.067688 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/564930ae-a228-4e79-9378-7c307b70a4fd-utilities\") pod \"564930ae-a228-4e79-9378-7c307b70a4fd\" (UID: \"564930ae-a228-4e79-9378-7c307b70a4fd\") " Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.067938 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctxkq\" (UniqueName: \"kubernetes.io/projected/564930ae-a228-4e79-9378-7c307b70a4fd-kube-api-access-ctxkq\") pod \"564930ae-a228-4e79-9378-7c307b70a4fd\" (UID: \"564930ae-a228-4e79-9378-7c307b70a4fd\") " Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.067968 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/564930ae-a228-4e79-9378-7c307b70a4fd-catalog-content\") pod 
\"564930ae-a228-4e79-9378-7c307b70a4fd\" (UID: \"564930ae-a228-4e79-9378-7c307b70a4fd\") " Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.068770 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/564930ae-a228-4e79-9378-7c307b70a4fd-utilities" (OuterVolumeSpecName: "utilities") pod "564930ae-a228-4e79-9378-7c307b70a4fd" (UID: "564930ae-a228-4e79-9378-7c307b70a4fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.075529 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564930ae-a228-4e79-9378-7c307b70a4fd-kube-api-access-ctxkq" (OuterVolumeSpecName: "kube-api-access-ctxkq") pod "564930ae-a228-4e79-9378-7c307b70a4fd" (UID: "564930ae-a228-4e79-9378-7c307b70a4fd"). InnerVolumeSpecName "kube-api-access-ctxkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.122324 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/564930ae-a228-4e79-9378-7c307b70a4fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "564930ae-a228-4e79-9378-7c307b70a4fd" (UID: "564930ae-a228-4e79-9378-7c307b70a4fd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.169273 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/564930ae-a228-4e79-9378-7c307b70a4fd-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.169301 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctxkq\" (UniqueName: \"kubernetes.io/projected/564930ae-a228-4e79-9378-7c307b70a4fd-kube-api-access-ctxkq\") on node \"crc\" DevicePath \"\"" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.169311 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/564930ae-a228-4e79-9378-7c307b70a4fd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.691396 4753 generic.go:334] "Generic (PLEG): container finished" podID="564930ae-a228-4e79-9378-7c307b70a4fd" containerID="0920951cabc037b9c7ecd9f3c83cb72a5afec8f4809afa320bfcf0f766458144" exitCode=0 Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.691453 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxdq" event={"ID":"564930ae-a228-4e79-9378-7c307b70a4fd","Type":"ContainerDied","Data":"0920951cabc037b9c7ecd9f3c83cb72a5afec8f4809afa320bfcf0f766458144"} Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.691480 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxdq" event={"ID":"564930ae-a228-4e79-9378-7c307b70a4fd","Type":"ContainerDied","Data":"4a57ef1912ad5293260b2d558c21dbe9882b407a812941154a5d4337ab17bb56"} Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.691496 4753 scope.go:117] "RemoveContainer" containerID="0920951cabc037b9c7ecd9f3c83cb72a5afec8f4809afa320bfcf0f766458144" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 
20:28:03.691623 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rxdq" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.709866 4753 scope.go:117] "RemoveContainer" containerID="0a0de94783c58a56fc0bdd9befa5eca24a47792ab10b98814ff795299c43fbe9" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.738475 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rxdq"] Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.744169 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8rxdq"] Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.754186 4753 scope.go:117] "RemoveContainer" containerID="316a2819408df0ec54a7b98b491ba6800294d08ebdcb924bcd9d31f63babb3f0" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.770494 4753 scope.go:117] "RemoveContainer" containerID="0920951cabc037b9c7ecd9f3c83cb72a5afec8f4809afa320bfcf0f766458144" Oct 05 20:28:03 crc kubenswrapper[4753]: E1005 20:28:03.770954 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0920951cabc037b9c7ecd9f3c83cb72a5afec8f4809afa320bfcf0f766458144\": container with ID starting with 0920951cabc037b9c7ecd9f3c83cb72a5afec8f4809afa320bfcf0f766458144 not found: ID does not exist" containerID="0920951cabc037b9c7ecd9f3c83cb72a5afec8f4809afa320bfcf0f766458144" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.771025 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0920951cabc037b9c7ecd9f3c83cb72a5afec8f4809afa320bfcf0f766458144"} err="failed to get container status \"0920951cabc037b9c7ecd9f3c83cb72a5afec8f4809afa320bfcf0f766458144\": rpc error: code = NotFound desc = could not find container \"0920951cabc037b9c7ecd9f3c83cb72a5afec8f4809afa320bfcf0f766458144\": container with ID starting with 
0920951cabc037b9c7ecd9f3c83cb72a5afec8f4809afa320bfcf0f766458144 not found: ID does not exist" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.771047 4753 scope.go:117] "RemoveContainer" containerID="0a0de94783c58a56fc0bdd9befa5eca24a47792ab10b98814ff795299c43fbe9" Oct 05 20:28:03 crc kubenswrapper[4753]: E1005 20:28:03.772018 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a0de94783c58a56fc0bdd9befa5eca24a47792ab10b98814ff795299c43fbe9\": container with ID starting with 0a0de94783c58a56fc0bdd9befa5eca24a47792ab10b98814ff795299c43fbe9 not found: ID does not exist" containerID="0a0de94783c58a56fc0bdd9befa5eca24a47792ab10b98814ff795299c43fbe9" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.772071 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a0de94783c58a56fc0bdd9befa5eca24a47792ab10b98814ff795299c43fbe9"} err="failed to get container status \"0a0de94783c58a56fc0bdd9befa5eca24a47792ab10b98814ff795299c43fbe9\": rpc error: code = NotFound desc = could not find container \"0a0de94783c58a56fc0bdd9befa5eca24a47792ab10b98814ff795299c43fbe9\": container with ID starting with 0a0de94783c58a56fc0bdd9befa5eca24a47792ab10b98814ff795299c43fbe9 not found: ID does not exist" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.772097 4753 scope.go:117] "RemoveContainer" containerID="316a2819408df0ec54a7b98b491ba6800294d08ebdcb924bcd9d31f63babb3f0" Oct 05 20:28:03 crc kubenswrapper[4753]: E1005 20:28:03.772483 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"316a2819408df0ec54a7b98b491ba6800294d08ebdcb924bcd9d31f63babb3f0\": container with ID starting with 316a2819408df0ec54a7b98b491ba6800294d08ebdcb924bcd9d31f63babb3f0 not found: ID does not exist" containerID="316a2819408df0ec54a7b98b491ba6800294d08ebdcb924bcd9d31f63babb3f0" Oct 05 20:28:03 crc 
kubenswrapper[4753]: I1005 20:28:03.772533 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316a2819408df0ec54a7b98b491ba6800294d08ebdcb924bcd9d31f63babb3f0"} err="failed to get container status \"316a2819408df0ec54a7b98b491ba6800294d08ebdcb924bcd9d31f63babb3f0\": rpc error: code = NotFound desc = could not find container \"316a2819408df0ec54a7b98b491ba6800294d08ebdcb924bcd9d31f63babb3f0\": container with ID starting with 316a2819408df0ec54a7b98b491ba6800294d08ebdcb924bcd9d31f63babb3f0 not found: ID does not exist" Oct 05 20:28:03 crc kubenswrapper[4753]: I1005 20:28:03.863064 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564930ae-a228-4e79-9378-7c307b70a4fd" path="/var/lib/kubelet/pods/564930ae-a228-4e79-9378-7c307b70a4fd/volumes" Oct 05 20:28:04 crc kubenswrapper[4753]: I1005 20:28:04.489653 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:28:04 crc kubenswrapper[4753]: I1005 20:28:04.489906 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:28:04 crc kubenswrapper[4753]: I1005 20:28:04.489977 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:28:04 crc kubenswrapper[4753]: I1005 20:28:04.490542 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"52a63543bb254f48756b18656816abfb4df8b41bf216da0e9c35cd4d17058bd4"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 20:28:04 crc kubenswrapper[4753]: I1005 20:28:04.490674 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" containerID="cri-o://52a63543bb254f48756b18656816abfb4df8b41bf216da0e9c35cd4d17058bd4" gracePeriod=600 Oct 05 20:28:04 crc kubenswrapper[4753]: I1005 20:28:04.701716 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="52a63543bb254f48756b18656816abfb4df8b41bf216da0e9c35cd4d17058bd4" exitCode=0 Oct 05 20:28:04 crc kubenswrapper[4753]: I1005 20:28:04.701800 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"52a63543bb254f48756b18656816abfb4df8b41bf216da0e9c35cd4d17058bd4"} Oct 05 20:28:04 crc kubenswrapper[4753]: I1005 20:28:04.701838 4753 scope.go:117] "RemoveContainer" containerID="5ebcd29664463350c05d6256aae98be7654b14c206df5bbe017e8126dff22fad" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.344962 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5bkds"] Oct 05 20:28:05 crc kubenswrapper[4753]: E1005 20:28:05.345481 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564930ae-a228-4e79-9378-7c307b70a4fd" containerName="extract-utilities" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.345491 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="564930ae-a228-4e79-9378-7c307b70a4fd" 
containerName="extract-utilities" Oct 05 20:28:05 crc kubenswrapper[4753]: E1005 20:28:05.345502 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564930ae-a228-4e79-9378-7c307b70a4fd" containerName="registry-server" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.345508 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="564930ae-a228-4e79-9378-7c307b70a4fd" containerName="registry-server" Oct 05 20:28:05 crc kubenswrapper[4753]: E1005 20:28:05.345532 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564930ae-a228-4e79-9378-7c307b70a4fd" containerName="extract-content" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.345538 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="564930ae-a228-4e79-9378-7c307b70a4fd" containerName="extract-content" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.345638 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="564930ae-a228-4e79-9378-7c307b70a4fd" containerName="registry-server" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.346400 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.395571 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81397f28-bb39-4bfc-97dc-ec850c6671a8-utilities\") pod \"community-operators-5bkds\" (UID: \"81397f28-bb39-4bfc-97dc-ec850c6671a8\") " pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.395627 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nsn8\" (UniqueName: \"kubernetes.io/projected/81397f28-bb39-4bfc-97dc-ec850c6671a8-kube-api-access-8nsn8\") pod \"community-operators-5bkds\" (UID: \"81397f28-bb39-4bfc-97dc-ec850c6671a8\") " pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.395658 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81397f28-bb39-4bfc-97dc-ec850c6671a8-catalog-content\") pod \"community-operators-5bkds\" (UID: \"81397f28-bb39-4bfc-97dc-ec850c6671a8\") " pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.402332 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5bkds"] Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.497379 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nsn8\" (UniqueName: \"kubernetes.io/projected/81397f28-bb39-4bfc-97dc-ec850c6671a8-kube-api-access-8nsn8\") pod \"community-operators-5bkds\" (UID: \"81397f28-bb39-4bfc-97dc-ec850c6671a8\") " pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.497431 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81397f28-bb39-4bfc-97dc-ec850c6671a8-catalog-content\") pod \"community-operators-5bkds\" (UID: \"81397f28-bb39-4bfc-97dc-ec850c6671a8\") " pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.497497 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81397f28-bb39-4bfc-97dc-ec850c6671a8-utilities\") pod \"community-operators-5bkds\" (UID: \"81397f28-bb39-4bfc-97dc-ec850c6671a8\") " pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.497962 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81397f28-bb39-4bfc-97dc-ec850c6671a8-utilities\") pod \"community-operators-5bkds\" (UID: \"81397f28-bb39-4bfc-97dc-ec850c6671a8\") " pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.498034 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81397f28-bb39-4bfc-97dc-ec850c6671a8-catalog-content\") pod \"community-operators-5bkds\" (UID: \"81397f28-bb39-4bfc-97dc-ec850c6671a8\") " pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.529084 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nsn8\" (UniqueName: \"kubernetes.io/projected/81397f28-bb39-4bfc-97dc-ec850c6671a8-kube-api-access-8nsn8\") pod \"community-operators-5bkds\" (UID: \"81397f28-bb39-4bfc-97dc-ec850c6671a8\") " pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.659413 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:05 crc kubenswrapper[4753]: I1005 20:28:05.711977 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"4bd1799dc562bea716f28381e5b80e668ae3afe82aa4afc891b52d8b6b3b6337"} Oct 05 20:28:06 crc kubenswrapper[4753]: I1005 20:28:06.217283 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5bkds"] Oct 05 20:28:06 crc kubenswrapper[4753]: W1005 20:28:06.224714 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81397f28_bb39_4bfc_97dc_ec850c6671a8.slice/crio-5608acc3f803d7c9cad2cbe52e89fcb46e73c5387ee94dfb220fa4875f778d4d WatchSource:0}: Error finding container 5608acc3f803d7c9cad2cbe52e89fcb46e73c5387ee94dfb220fa4875f778d4d: Status 404 returned error can't find the container with id 5608acc3f803d7c9cad2cbe52e89fcb46e73c5387ee94dfb220fa4875f778d4d Oct 05 20:28:06 crc kubenswrapper[4753]: I1005 20:28:06.718063 4753 generic.go:334] "Generic (PLEG): container finished" podID="81397f28-bb39-4bfc-97dc-ec850c6671a8" containerID="e531d7fd60e72d8a444dd23ddc0a261697ff3397136ccd68f930d91b3b96704c" exitCode=0 Oct 05 20:28:06 crc kubenswrapper[4753]: I1005 20:28:06.718126 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bkds" event={"ID":"81397f28-bb39-4bfc-97dc-ec850c6671a8","Type":"ContainerDied","Data":"e531d7fd60e72d8a444dd23ddc0a261697ff3397136ccd68f930d91b3b96704c"} Oct 05 20:28:06 crc kubenswrapper[4753]: I1005 20:28:06.718222 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bkds" 
event={"ID":"81397f28-bb39-4bfc-97dc-ec850c6671a8","Type":"ContainerStarted","Data":"5608acc3f803d7c9cad2cbe52e89fcb46e73c5387ee94dfb220fa4875f778d4d"} Oct 05 20:28:07 crc kubenswrapper[4753]: I1005 20:28:07.724643 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bkds" event={"ID":"81397f28-bb39-4bfc-97dc-ec850c6671a8","Type":"ContainerStarted","Data":"9cfcc43cb38e6ef8f9a50a71163b8375a35050db3ea71cc94114cd1b06d2195e"} Oct 05 20:28:08 crc kubenswrapper[4753]: I1005 20:28:08.343445 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gqlbb"] Oct 05 20:28:08 crc kubenswrapper[4753]: I1005 20:28:08.344681 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:08 crc kubenswrapper[4753]: I1005 20:28:08.356782 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqlbb"] Oct 05 20:28:08 crc kubenswrapper[4753]: I1005 20:28:08.440772 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bsvx\" (UniqueName: \"kubernetes.io/projected/92664ac3-02b4-4557-b7d4-2c638b4c082f-kube-api-access-8bsvx\") pod \"redhat-operators-gqlbb\" (UID: \"92664ac3-02b4-4557-b7d4-2c638b4c082f\") " pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:08 crc kubenswrapper[4753]: I1005 20:28:08.441046 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92664ac3-02b4-4557-b7d4-2c638b4c082f-catalog-content\") pod \"redhat-operators-gqlbb\" (UID: \"92664ac3-02b4-4557-b7d4-2c638b4c082f\") " pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:08 crc kubenswrapper[4753]: I1005 20:28:08.441180 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92664ac3-02b4-4557-b7d4-2c638b4c082f-utilities\") pod \"redhat-operators-gqlbb\" (UID: \"92664ac3-02b4-4557-b7d4-2c638b4c082f\") " pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:08 crc kubenswrapper[4753]: I1005 20:28:08.542340 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bsvx\" (UniqueName: \"kubernetes.io/projected/92664ac3-02b4-4557-b7d4-2c638b4c082f-kube-api-access-8bsvx\") pod \"redhat-operators-gqlbb\" (UID: \"92664ac3-02b4-4557-b7d4-2c638b4c082f\") " pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:08 crc kubenswrapper[4753]: I1005 20:28:08.542396 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92664ac3-02b4-4557-b7d4-2c638b4c082f-catalog-content\") pod \"redhat-operators-gqlbb\" (UID: \"92664ac3-02b4-4557-b7d4-2c638b4c082f\") " pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:08 crc kubenswrapper[4753]: I1005 20:28:08.542437 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92664ac3-02b4-4557-b7d4-2c638b4c082f-utilities\") pod \"redhat-operators-gqlbb\" (UID: \"92664ac3-02b4-4557-b7d4-2c638b4c082f\") " pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:08 crc kubenswrapper[4753]: I1005 20:28:08.542906 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92664ac3-02b4-4557-b7d4-2c638b4c082f-utilities\") pod \"redhat-operators-gqlbb\" (UID: \"92664ac3-02b4-4557-b7d4-2c638b4c082f\") " pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:08 crc kubenswrapper[4753]: I1005 20:28:08.543410 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/92664ac3-02b4-4557-b7d4-2c638b4c082f-catalog-content\") pod \"redhat-operators-gqlbb\" (UID: \"92664ac3-02b4-4557-b7d4-2c638b4c082f\") " pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:08 crc kubenswrapper[4753]: I1005 20:28:08.564358 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bsvx\" (UniqueName: \"kubernetes.io/projected/92664ac3-02b4-4557-b7d4-2c638b4c082f-kube-api-access-8bsvx\") pod \"redhat-operators-gqlbb\" (UID: \"92664ac3-02b4-4557-b7d4-2c638b4c082f\") " pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:08 crc kubenswrapper[4753]: I1005 20:28:08.657461 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:08 crc kubenswrapper[4753]: I1005 20:28:08.737779 4753 generic.go:334] "Generic (PLEG): container finished" podID="81397f28-bb39-4bfc-97dc-ec850c6671a8" containerID="9cfcc43cb38e6ef8f9a50a71163b8375a35050db3ea71cc94114cd1b06d2195e" exitCode=0 Oct 05 20:28:08 crc kubenswrapper[4753]: I1005 20:28:08.737814 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bkds" event={"ID":"81397f28-bb39-4bfc-97dc-ec850c6671a8","Type":"ContainerDied","Data":"9cfcc43cb38e6ef8f9a50a71163b8375a35050db3ea71cc94114cd1b06d2195e"} Oct 05 20:28:09 crc kubenswrapper[4753]: I1005 20:28:09.067602 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqlbb"] Oct 05 20:28:09 crc kubenswrapper[4753]: I1005 20:28:09.744874 4753 generic.go:334] "Generic (PLEG): container finished" podID="92664ac3-02b4-4557-b7d4-2c638b4c082f" containerID="d4a1491d19705c8430c191101840116073c137fa678930cd6f4777933c65f0f8" exitCode=0 Oct 05 20:28:09 crc kubenswrapper[4753]: I1005 20:28:09.744995 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqlbb" 
event={"ID":"92664ac3-02b4-4557-b7d4-2c638b4c082f","Type":"ContainerDied","Data":"d4a1491d19705c8430c191101840116073c137fa678930cd6f4777933c65f0f8"} Oct 05 20:28:09 crc kubenswrapper[4753]: I1005 20:28:09.745271 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqlbb" event={"ID":"92664ac3-02b4-4557-b7d4-2c638b4c082f","Type":"ContainerStarted","Data":"31150ad5f732415c8d7ddefae1c8070b806eff0bfda4fc414c4efaac4c78363f"} Oct 05 20:28:09 crc kubenswrapper[4753]: I1005 20:28:09.749682 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bkds" event={"ID":"81397f28-bb39-4bfc-97dc-ec850c6671a8","Type":"ContainerStarted","Data":"0e90761f9bc938106c267c0fcb09bc4a50d5c207e25cadd037395d2dfb8e93ec"} Oct 05 20:28:09 crc kubenswrapper[4753]: I1005 20:28:09.785237 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5bkds" podStartSLOduration=2.368186118 podStartE2EDuration="4.785221869s" podCreationTimestamp="2025-10-05 20:28:05 +0000 UTC" firstStartedPulling="2025-10-05 20:28:06.719760296 +0000 UTC m=+795.568088528" lastFinishedPulling="2025-10-05 20:28:09.136796057 +0000 UTC m=+797.985124279" observedRunningTime="2025-10-05 20:28:09.783886588 +0000 UTC m=+798.632214830" watchObservedRunningTime="2025-10-05 20:28:09.785221869 +0000 UTC m=+798.633550101" Oct 05 20:28:10 crc kubenswrapper[4753]: I1005 20:28:10.759091 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqlbb" event={"ID":"92664ac3-02b4-4557-b7d4-2c638b4c082f","Type":"ContainerStarted","Data":"6c3a564d01e04e96354566e026e93644c270a8bf38a3aaa827d07bf101383fd5"} Oct 05 20:28:11 crc kubenswrapper[4753]: I1005 20:28:11.765003 4753 generic.go:334] "Generic (PLEG): container finished" podID="92664ac3-02b4-4557-b7d4-2c638b4c082f" containerID="6c3a564d01e04e96354566e026e93644c270a8bf38a3aaa827d07bf101383fd5" 
exitCode=0 Oct 05 20:28:11 crc kubenswrapper[4753]: I1005 20:28:11.765056 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqlbb" event={"ID":"92664ac3-02b4-4557-b7d4-2c638b4c082f","Type":"ContainerDied","Data":"6c3a564d01e04e96354566e026e93644c270a8bf38a3aaa827d07bf101383fd5"} Oct 05 20:28:12 crc kubenswrapper[4753]: I1005 20:28:12.773201 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqlbb" event={"ID":"92664ac3-02b4-4557-b7d4-2c638b4c082f","Type":"ContainerStarted","Data":"8c8380e9d964bcce409109dbe53f2348907c854c75248ae2c343ab887aebaca2"} Oct 05 20:28:12 crc kubenswrapper[4753]: I1005 20:28:12.794368 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gqlbb" podStartSLOduration=2.3465188 podStartE2EDuration="4.794352666s" podCreationTimestamp="2025-10-05 20:28:08 +0000 UTC" firstStartedPulling="2025-10-05 20:28:09.746265732 +0000 UTC m=+798.594593964" lastFinishedPulling="2025-10-05 20:28:12.194099598 +0000 UTC m=+801.042427830" observedRunningTime="2025-10-05 20:28:12.791115316 +0000 UTC m=+801.639443548" watchObservedRunningTime="2025-10-05 20:28:12.794352666 +0000 UTC m=+801.642680898" Oct 05 20:28:15 crc kubenswrapper[4753]: I1005 20:28:15.659620 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:15 crc kubenswrapper[4753]: I1005 20:28:15.660300 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:15 crc kubenswrapper[4753]: I1005 20:28:15.715995 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:15 crc kubenswrapper[4753]: I1005 20:28:15.848449 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:16 crc kubenswrapper[4753]: I1005 20:28:16.141097 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5bkds"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.382549 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5b974f6766-wsbjp"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.384016 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-wsbjp" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.387060 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-qd6mw" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.408760 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.409874 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.415511 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-rl65c" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.429544 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-4wqxw"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.430512 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-4wqxw" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.444173 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-9lh7x" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.462938 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2px58\" (UniqueName: \"kubernetes.io/projected/64896158-a10b-4fd9-b232-5ba3fa647a02-kube-api-access-2px58\") pod \"cinder-operator-controller-manager-84bd8f6848-p5g48\" (UID: \"64896158-a10b-4fd9-b232-5ba3fa647a02\") " pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.463233 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxv6f\" (UniqueName: \"kubernetes.io/projected/12ff014d-81e6-4a9e-8197-e28fbfc4a06e-kube-api-access-hxv6f\") pod \"barbican-operator-controller-manager-5b974f6766-wsbjp\" (UID: \"12ff014d-81e6-4a9e-8197-e28fbfc4a06e\") " pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-wsbjp" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.479308 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5b974f6766-wsbjp"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.491021 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.492463 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.494792 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xw5b9" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.509021 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.510203 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.514457 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-2mvnp" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.519626 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.530353 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-4wqxw"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.540697 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.540791 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.540804 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.541974 4753 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.556545 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-nghwz" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.564328 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2px58\" (UniqueName: \"kubernetes.io/projected/64896158-a10b-4fd9-b232-5ba3fa647a02-kube-api-access-2px58\") pod \"cinder-operator-controller-manager-84bd8f6848-p5g48\" (UID: \"64896158-a10b-4fd9-b232-5ba3fa647a02\") " pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.564419 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qxlr\" (UniqueName: \"kubernetes.io/projected/9286136d-f0a7-4488-b346-2b3ea3ab81da-kube-api-access-6qxlr\") pod \"designate-operator-controller-manager-58d86cd59d-4wqxw\" (UID: \"9286136d-f0a7-4488-b346-2b3ea3ab81da\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-4wqxw" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.564472 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5sfv\" (UniqueName: \"kubernetes.io/projected/5b8831b7-9250-4ec8-b732-2db04e507cfe-kube-api-access-t5sfv\") pod \"heat-operator-controller-manager-5c497dbdb-txd4b\" (UID: \"5b8831b7-9250-4ec8-b732-2db04e507cfe\") " pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.564504 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxv6f\" (UniqueName: 
\"kubernetes.io/projected/12ff014d-81e6-4a9e-8197-e28fbfc4a06e-kube-api-access-hxv6f\") pod \"barbican-operator-controller-manager-5b974f6766-wsbjp\" (UID: \"12ff014d-81e6-4a9e-8197-e28fbfc4a06e\") " pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-wsbjp" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.564544 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkkth\" (UniqueName: \"kubernetes.io/projected/d8c88aaa-c54b-4f65-be07-61e23d5a5cd4-kube-api-access-gkkth\") pod \"glance-operator-controller-manager-698456cdc6-tnt6j\" (UID: \"d8c88aaa-c54b-4f65-be07-61e23d5a5cd4\") " pod="openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.568682 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.570196 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.577431 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.577754 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9vnkv" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.590778 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.598290 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.606024 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f5894c49f-ct2l6"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.607332 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-ct2l6" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.618192 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-drwrk" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.642673 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxv6f\" (UniqueName: \"kubernetes.io/projected/12ff014d-81e6-4a9e-8197-e28fbfc4a06e-kube-api-access-hxv6f\") pod \"barbican-operator-controller-manager-5b974f6766-wsbjp\" (UID: \"12ff014d-81e6-4a9e-8197-e28fbfc4a06e\") " pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-wsbjp" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.662099 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2px58\" (UniqueName: \"kubernetes.io/projected/64896158-a10b-4fd9-b232-5ba3fa647a02-kube-api-access-2px58\") pod \"cinder-operator-controller-manager-84bd8f6848-p5g48\" (UID: \"64896158-a10b-4fd9-b232-5ba3fa647a02\") " pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.667012 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qxlr\" (UniqueName: \"kubernetes.io/projected/9286136d-f0a7-4488-b346-2b3ea3ab81da-kube-api-access-6qxlr\") pod \"designate-operator-controller-manager-58d86cd59d-4wqxw\" (UID: \"9286136d-f0a7-4488-b346-2b3ea3ab81da\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-4wqxw" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.667069 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5sfv\" (UniqueName: \"kubernetes.io/projected/5b8831b7-9250-4ec8-b732-2db04e507cfe-kube-api-access-t5sfv\") pod 
\"heat-operator-controller-manager-5c497dbdb-txd4b\" (UID: \"5b8831b7-9250-4ec8-b732-2db04e507cfe\") " pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.667102 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkkth\" (UniqueName: \"kubernetes.io/projected/d8c88aaa-c54b-4f65-be07-61e23d5a5cd4-kube-api-access-gkkth\") pod \"glance-operator-controller-manager-698456cdc6-tnt6j\" (UID: \"d8c88aaa-c54b-4f65-be07-61e23d5a5cd4\") " pod="openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.667170 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z54tl\" (UniqueName: \"kubernetes.io/projected/266b0921-1164-46bc-9e78-986f5ded5943-kube-api-access-z54tl\") pod \"ironic-operator-controller-manager-6f5894c49f-ct2l6\" (UID: \"266b0921-1164-46bc-9e78-986f5ded5943\") " pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-ct2l6" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.667193 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kbkb\" (UniqueName: \"kubernetes.io/projected/f241e98d-8f7c-492a-a4bc-988dc78b6449-kube-api-access-7kbkb\") pod \"infra-operator-controller-manager-84788b6bc5-vksxs\" (UID: \"f241e98d-8f7c-492a-a4bc-988dc78b6449\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.667238 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h6gf\" (UniqueName: \"kubernetes.io/projected/885f705b-599d-41fe-92cf-ffd000ad5e6e-kube-api-access-8h6gf\") pod \"horizon-operator-controller-manager-6675647785-dqfcj\" (UID: \"885f705b-599d-41fe-92cf-ffd000ad5e6e\") " 
pod="openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.667268 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f241e98d-8f7c-492a-a4bc-988dc78b6449-cert\") pod \"infra-operator-controller-manager-84788b6bc5-vksxs\" (UID: \"f241e98d-8f7c-492a-a4bc-988dc78b6449\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.678843 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f5894c49f-ct2l6"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.688388 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.707999 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-wsbjp" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.709310 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.726704 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkkth\" (UniqueName: \"kubernetes.io/projected/d8c88aaa-c54b-4f65-be07-61e23d5a5cd4-kube-api-access-gkkth\") pod \"glance-operator-controller-manager-698456cdc6-tnt6j\" (UID: \"d8c88aaa-c54b-4f65-be07-61e23d5a5cd4\") " pod="openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.735257 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.750283 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5sfv\" (UniqueName: \"kubernetes.io/projected/5b8831b7-9250-4ec8-b732-2db04e507cfe-kube-api-access-t5sfv\") pod \"heat-operator-controller-manager-5c497dbdb-txd4b\" (UID: \"5b8831b7-9250-4ec8-b732-2db04e507cfe\") " pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.751011 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-tsfcz" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.753811 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qxlr\" (UniqueName: \"kubernetes.io/projected/9286136d-f0a7-4488-b346-2b3ea3ab81da-kube-api-access-6qxlr\") pod \"designate-operator-controller-manager-58d86cd59d-4wqxw\" (UID: \"9286136d-f0a7-4488-b346-2b3ea3ab81da\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-4wqxw" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.764788 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-4wqxw" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.776153 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlp7x\" (UniqueName: \"kubernetes.io/projected/dd3487ac-89f8-40f1-967e-71f7fada0fe1-kube-api-access-qlp7x\") pod \"keystone-operator-controller-manager-57c9cdcf57-9kjpf\" (UID: \"dd3487ac-89f8-40f1-967e-71f7fada0fe1\") " pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.776253 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z54tl\" (UniqueName: \"kubernetes.io/projected/266b0921-1164-46bc-9e78-986f5ded5943-kube-api-access-z54tl\") pod \"ironic-operator-controller-manager-6f5894c49f-ct2l6\" (UID: \"266b0921-1164-46bc-9e78-986f5ded5943\") " pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-ct2l6" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.776291 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kbkb\" (UniqueName: \"kubernetes.io/projected/f241e98d-8f7c-492a-a4bc-988dc78b6449-kube-api-access-7kbkb\") pod \"infra-operator-controller-manager-84788b6bc5-vksxs\" (UID: \"f241e98d-8f7c-492a-a4bc-988dc78b6449\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.776321 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h6gf\" (UniqueName: \"kubernetes.io/projected/885f705b-599d-41fe-92cf-ffd000ad5e6e-kube-api-access-8h6gf\") pod \"horizon-operator-controller-manager-6675647785-dqfcj\" (UID: \"885f705b-599d-41fe-92cf-ffd000ad5e6e\") " pod="openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj" Oct 05 20:28:17 crc 
kubenswrapper[4753]: I1005 20:28:17.776351 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f241e98d-8f7c-492a-a4bc-988dc78b6449-cert\") pod \"infra-operator-controller-manager-84788b6bc5-vksxs\" (UID: \"f241e98d-8f7c-492a-a4bc-988dc78b6449\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" Oct 05 20:28:17 crc kubenswrapper[4753]: E1005 20:28:17.776605 4753 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 05 20:28:17 crc kubenswrapper[4753]: E1005 20:28:17.776673 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f241e98d-8f7c-492a-a4bc-988dc78b6449-cert podName:f241e98d-8f7c-492a-a4bc-988dc78b6449 nodeName:}" failed. No retries permitted until 2025-10-05 20:28:18.276651844 +0000 UTC m=+807.124980076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f241e98d-8f7c-492a-a4bc-988dc78b6449-cert") pod "infra-operator-controller-manager-84788b6bc5-vksxs" (UID: "f241e98d-8f7c-492a-a4bc-988dc78b6449") : secret "infra-operator-webhook-server-cert" not found Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.787025 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.853707 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kbkb\" (UniqueName: \"kubernetes.io/projected/f241e98d-8f7c-492a-a4bc-988dc78b6449-kube-api-access-7kbkb\") pod \"infra-operator-controller-manager-84788b6bc5-vksxs\" (UID: \"f241e98d-8f7c-492a-a4bc-988dc78b6449\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.865916 4753 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.870188 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.870323 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.874720 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.877646 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlp7x\" (UniqueName: \"kubernetes.io/projected/dd3487ac-89f8-40f1-967e-71f7fada0fe1-kube-api-access-qlp7x\") pod \"keystone-operator-controller-manager-57c9cdcf57-9kjpf\" (UID: \"dd3487ac-89f8-40f1-967e-71f7fada0fe1\") " pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.877790 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfzkr\" (UniqueName: \"kubernetes.io/projected/0c5e8f9b-e10e-436b-ae33-07a7350f02a1-kube-api-access-xfzkr\") pod \"manila-operator-controller-manager-7cb48dbc-f4dp5\" (UID: \"0c5e8f9b-e10e-436b-ae33-07a7350f02a1\") " pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.897394 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z54tl\" (UniqueName: \"kubernetes.io/projected/266b0921-1164-46bc-9e78-986f5ded5943-kube-api-access-z54tl\") pod 
\"ironic-operator-controller-manager-6f5894c49f-ct2l6\" (UID: \"266b0921-1164-46bc-9e78-986f5ded5943\") " pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-ct2l6" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.918760 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-p9f5g" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.921768 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5bkds" podUID="81397f28-bb39-4bfc-97dc-ec850c6671a8" containerName="registry-server" containerID="cri-o://0e90761f9bc938106c267c0fcb09bc4a50d5c207e25cadd037395d2dfb8e93ec" gracePeriod=2 Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.924522 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h6gf\" (UniqueName: \"kubernetes.io/projected/885f705b-599d-41fe-92cf-ffd000ad5e6e-kube-api-access-8h6gf\") pod \"horizon-operator-controller-manager-6675647785-dqfcj\" (UID: \"885f705b-599d-41fe-92cf-ffd000ad5e6e\") " pod="openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.941261 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlp7x\" (UniqueName: \"kubernetes.io/projected/dd3487ac-89f8-40f1-967e-71f7fada0fe1-kube-api-access-qlp7x\") pod \"keystone-operator-controller-manager-57c9cdcf57-9kjpf\" (UID: \"dd3487ac-89f8-40f1-967e-71f7fada0fe1\") " pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.979634 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfzkr\" (UniqueName: \"kubernetes.io/projected/0c5e8f9b-e10e-436b-ae33-07a7350f02a1-kube-api-access-xfzkr\") pod \"manila-operator-controller-manager-7cb48dbc-f4dp5\" 
(UID: \"0c5e8f9b-e10e-436b-ae33-07a7350f02a1\") " pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.985528 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-ct2l6" Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.998530 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5"] Oct 05 20:28:17 crc kubenswrapper[4753]: I1005 20:28:17.998784 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-rlsgp"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.001452 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-rlsgp"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.001544 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-rlsgp" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.008312 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-24ngg" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.032688 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.033818 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.040144 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-6c9b57c67-cp4qf"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.041324 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-v82wz" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.041441 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-cp4qf" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.046523 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-8x4vz" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.050613 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfzkr\" (UniqueName: \"kubernetes.io/projected/0c5e8f9b-e10e-436b-ae33-07a7350f02a1-kube-api-access-xfzkr\") pod \"manila-operator-controller-manager-7cb48dbc-f4dp5\" (UID: \"0c5e8f9b-e10e-436b-ae33-07a7350f02a1\") " pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.052793 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.085186 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-6c9b57c67-cp4qf"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.088170 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgnqn\" (UniqueName: 
\"kubernetes.io/projected/9c8b9aa1-e15e-475d-a02e-56b430d50bd1-kube-api-access-sgnqn\") pod \"neutron-operator-controller-manager-69b956fbf6-vtgfd\" (UID: \"9c8b9aa1-e15e-475d-a02e-56b430d50bd1\") " pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.088369 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dv9p\" (UniqueName: \"kubernetes.io/projected/00eefbb7-989e-478d-aad3-ff4d236168f2-kube-api-access-9dv9p\") pod \"mariadb-operator-controller-manager-d6c9dc5bc-rlsgp\" (UID: \"00eefbb7-989e-478d-aad3-ff4d236168f2\") " pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-rlsgp" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.088560 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nmfc\" (UniqueName: \"kubernetes.io/projected/3e514d87-9323-4c3b-a372-60e5c65fa731-kube-api-access-4nmfc\") pod \"nova-operator-controller-manager-6c9b57c67-cp4qf\" (UID: \"3e514d87-9323-4c3b-a372-60e5c65fa731\") " pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-cp4qf" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.108062 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f59f9d8-htstg"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.109248 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-htstg" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.118905 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-f4fzq" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.154201 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f59f9d8-htstg"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.173540 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.174730 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.177799 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.192462 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcqkv\" (UniqueName: \"kubernetes.io/projected/d58d3fcd-368b-4d73-8c29-a181f3bdddee-kube-api-access-zcqkv\") pod \"octavia-operator-controller-manager-69f59f9d8-htstg\" (UID: \"d58d3fcd-368b-4d73-8c29-a181f3bdddee\") " pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-htstg" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.192525 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nmfc\" (UniqueName: \"kubernetes.io/projected/3e514d87-9323-4c3b-a372-60e5c65fa731-kube-api-access-4nmfc\") pod \"nova-operator-controller-manager-6c9b57c67-cp4qf\" (UID: \"3e514d87-9323-4c3b-a372-60e5c65fa731\") " pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-cp4qf" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.192556 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2flq\" (UniqueName: \"kubernetes.io/projected/28d42154-af7b-440b-af1b-2ef50ee9edca-kube-api-access-v2flq\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm\" (UID: \"28d42154-af7b-440b-af1b-2ef50ee9edca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.192586 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgnqn\" (UniqueName: \"kubernetes.io/projected/9c8b9aa1-e15e-475d-a02e-56b430d50bd1-kube-api-access-sgnqn\") pod \"neutron-operator-controller-manager-69b956fbf6-vtgfd\" (UID: \"9c8b9aa1-e15e-475d-a02e-56b430d50bd1\") " pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd" 
Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.192630 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28d42154-af7b-440b-af1b-2ef50ee9edca-cert\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm\" (UID: \"28d42154-af7b-440b-af1b-2ef50ee9edca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.192663 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dv9p\" (UniqueName: \"kubernetes.io/projected/00eefbb7-989e-478d-aad3-ff4d236168f2-kube-api-access-9dv9p\") pod \"mariadb-operator-controller-manager-d6c9dc5bc-rlsgp\" (UID: \"00eefbb7-989e-478d-aad3-ff4d236168f2\") " pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-rlsgp" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.209445 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.225937 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-dls52" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.239980 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.252246 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.261207 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-c968bb45-xsjn9"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.262400 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-c968bb45-xsjn9" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.265774 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dv9p\" (UniqueName: \"kubernetes.io/projected/00eefbb7-989e-478d-aad3-ff4d236168f2-kube-api-access-9dv9p\") pod \"mariadb-operator-controller-manager-d6c9dc5bc-rlsgp\" (UID: \"00eefbb7-989e-478d-aad3-ff4d236168f2\") " pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-rlsgp" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.266184 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nmfc\" (UniqueName: \"kubernetes.io/projected/3e514d87-9323-4c3b-a372-60e5c65fa731-kube-api-access-4nmfc\") pod \"nova-operator-controller-manager-6c9b57c67-cp4qf\" (UID: \"3e514d87-9323-4c3b-a372-60e5c65fa731\") " pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-cp4qf" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.266605 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ghb7b" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.285056 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgnqn\" (UniqueName: \"kubernetes.io/projected/9c8b9aa1-e15e-475d-a02e-56b430d50bd1-kube-api-access-sgnqn\") pod \"neutron-operator-controller-manager-69b956fbf6-vtgfd\" (UID: \"9c8b9aa1-e15e-475d-a02e-56b430d50bd1\") " 
pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.299863 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28d42154-af7b-440b-af1b-2ef50ee9edca-cert\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm\" (UID: \"28d42154-af7b-440b-af1b-2ef50ee9edca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.299950 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f28ck\" (UniqueName: \"kubernetes.io/projected/8ed6d37f-576a-4f14-a98a-65193559d7de-kube-api-access-f28ck\") pod \"ovn-operator-controller-manager-c968bb45-xsjn9\" (UID: \"8ed6d37f-576a-4f14-a98a-65193559d7de\") " pod="openstack-operators/ovn-operator-controller-manager-c968bb45-xsjn9" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.299993 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f241e98d-8f7c-492a-a4bc-988dc78b6449-cert\") pod \"infra-operator-controller-manager-84788b6bc5-vksxs\" (UID: \"f241e98d-8f7c-492a-a4bc-988dc78b6449\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.300026 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcqkv\" (UniqueName: \"kubernetes.io/projected/d58d3fcd-368b-4d73-8c29-a181f3bdddee-kube-api-access-zcqkv\") pod \"octavia-operator-controller-manager-69f59f9d8-htstg\" (UID: \"d58d3fcd-368b-4d73-8c29-a181f3bdddee\") " pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-htstg" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.300110 4753 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-v2flq\" (UniqueName: \"kubernetes.io/projected/28d42154-af7b-440b-af1b-2ef50ee9edca-kube-api-access-v2flq\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm\" (UID: \"28d42154-af7b-440b-af1b-2ef50ee9edca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" Oct 05 20:28:18 crc kubenswrapper[4753]: E1005 20:28:18.300558 4753 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 05 20:28:18 crc kubenswrapper[4753]: E1005 20:28:18.300599 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f241e98d-8f7c-492a-a4bc-988dc78b6449-cert podName:f241e98d-8f7c-492a-a4bc-988dc78b6449 nodeName:}" failed. No retries permitted until 2025-10-05 20:28:19.300585167 +0000 UTC m=+808.148913399 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f241e98d-8f7c-492a-a4bc-988dc78b6449-cert") pod "infra-operator-controller-manager-84788b6bc5-vksxs" (UID: "f241e98d-8f7c-492a-a4bc-988dc78b6449") : secret "infra-operator-webhook-server-cert" not found Oct 05 20:28:18 crc kubenswrapper[4753]: E1005 20:28:18.300770 4753 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 05 20:28:18 crc kubenswrapper[4753]: E1005 20:28:18.300794 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28d42154-af7b-440b-af1b-2ef50ee9edca-cert podName:28d42154-af7b-440b-af1b-2ef50ee9edca nodeName:}" failed. No retries permitted until 2025-10-05 20:28:18.800787153 +0000 UTC m=+807.649115375 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/28d42154-af7b-440b-af1b-2ef50ee9edca-cert") pod "openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" (UID: "28d42154-af7b-440b-af1b-2ef50ee9edca") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.354945 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-66f6d6849b-9dsbr"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.355949 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-9dsbr" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.359410 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-rlsgp" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.370955 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.381253 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.392915 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-6bbcw" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.393438 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-cp4qf" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.401474 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvh8v\" (UniqueName: \"kubernetes.io/projected/ff1a796b-8cc7-4c73-842f-7b4a1170b56f-kube-api-access-hvh8v\") pod \"placement-operator-controller-manager-66f6d6849b-9dsbr\" (UID: \"ff1a796b-8cc7-4c73-842f-7b4a1170b56f\") " pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-9dsbr" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.401579 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f28ck\" (UniqueName: \"kubernetes.io/projected/8ed6d37f-576a-4f14-a98a-65193559d7de-kube-api-access-f28ck\") pod \"ovn-operator-controller-manager-c968bb45-xsjn9\" (UID: \"8ed6d37f-576a-4f14-a98a-65193559d7de\") " pod="openstack-operators/ovn-operator-controller-manager-c968bb45-xsjn9" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.405655 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2flq\" (UniqueName: \"kubernetes.io/projected/28d42154-af7b-440b-af1b-2ef50ee9edca-kube-api-access-v2flq\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm\" (UID: \"28d42154-af7b-440b-af1b-2ef50ee9edca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.410226 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcqkv\" (UniqueName: \"kubernetes.io/projected/d58d3fcd-368b-4d73-8c29-a181f3bdddee-kube-api-access-zcqkv\") pod \"octavia-operator-controller-manager-69f59f9d8-htstg\" (UID: \"d58d3fcd-368b-4d73-8c29-a181f3bdddee\") " pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-htstg" Oct 05 20:28:18 crc kubenswrapper[4753]: 
I1005 20:28:18.452226 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-c968bb45-xsjn9"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.463823 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f28ck\" (UniqueName: \"kubernetes.io/projected/8ed6d37f-576a-4f14-a98a-65193559d7de-kube-api-access-f28ck\") pod \"ovn-operator-controller-manager-c968bb45-xsjn9\" (UID: \"8ed6d37f-576a-4f14-a98a-65193559d7de\") " pod="openstack-operators/ovn-operator-controller-manager-c968bb45-xsjn9" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.502750 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvh8v\" (UniqueName: \"kubernetes.io/projected/ff1a796b-8cc7-4c73-842f-7b4a1170b56f-kube-api-access-hvh8v\") pod \"placement-operator-controller-manager-66f6d6849b-9dsbr\" (UID: \"ff1a796b-8cc7-4c73-842f-7b4a1170b56f\") " pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-9dsbr" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.506763 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-66f6d6849b-9dsbr"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.511398 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-4zhzq"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.512769 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-76d5577b-4zhzq" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.519095 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.520193 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.526555 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-htstg" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.533691 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-4w8pr" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.533899 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.534913 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.555302 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-4zhzq"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.571947 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-dcmhz" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.572197 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-brb6r" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.586699 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.621500 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k49lz\" (UniqueName: 
\"kubernetes.io/projected/82db7b73-2afb-4063-9d64-fc3fa5559e93-kube-api-access-k49lz\") pod \"telemetry-operator-controller-manager-f589c7597-58qhn\" (UID: \"82db7b73-2afb-4063-9d64-fc3fa5559e93\") " pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.621547 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drb6t\" (UniqueName: \"kubernetes.io/projected/995eda80-87fa-4160-b04e-679668f8d910-kube-api-access-drb6t\") pod \"swift-operator-controller-manager-76d5577b-4zhzq\" (UID: \"995eda80-87fa-4160-b04e-679668f8d910\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-4zhzq" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.621633 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbpq4\" (UniqueName: \"kubernetes.io/projected/96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3-kube-api-access-pbpq4\") pod \"watcher-operator-controller-manager-5d98cc5575-t99zs\" (UID: \"96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3\") " pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.624545 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-c968bb45-xsjn9" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.647645 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvh8v\" (UniqueName: \"kubernetes.io/projected/ff1a796b-8cc7-4c73-842f-7b4a1170b56f-kube-api-access-hvh8v\") pod \"placement-operator-controller-manager-66f6d6849b-9dsbr\" (UID: \"ff1a796b-8cc7-4c73-842f-7b4a1170b56f\") " pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-9dsbr" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.660482 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.660882 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.712303 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-zqgbl"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.714329 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-zqgbl" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.735993 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-2sxtw" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.737186 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbpq4\" (UniqueName: \"kubernetes.io/projected/96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3-kube-api-access-pbpq4\") pod \"watcher-operator-controller-manager-5d98cc5575-t99zs\" (UID: \"96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3\") " pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.737268 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k49lz\" (UniqueName: \"kubernetes.io/projected/82db7b73-2afb-4063-9d64-fc3fa5559e93-kube-api-access-k49lz\") pod \"telemetry-operator-controller-manager-f589c7597-58qhn\" (UID: \"82db7b73-2afb-4063-9d64-fc3fa5559e93\") " pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.737304 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drb6t\" (UniqueName: \"kubernetes.io/projected/995eda80-87fa-4160-b04e-679668f8d910-kube-api-access-drb6t\") pod \"swift-operator-controller-manager-76d5577b-4zhzq\" (UID: \"995eda80-87fa-4160-b04e-679668f8d910\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-4zhzq" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.762890 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-9dsbr" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.827259 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drb6t\" (UniqueName: \"kubernetes.io/projected/995eda80-87fa-4160-b04e-679668f8d910-kube-api-access-drb6t\") pod \"swift-operator-controller-manager-76d5577b-4zhzq\" (UID: \"995eda80-87fa-4160-b04e-679668f8d910\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-4zhzq" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.841254 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-zqgbl"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.844218 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdmxt\" (UniqueName: \"kubernetes.io/projected/8dd994e1-cb87-48dc-b844-2bdbc8b6e48d-kube-api-access-tdmxt\") pod \"test-operator-controller-manager-6bb6dcddc-zqgbl\" (UID: \"8dd994e1-cb87-48dc-b844-2bdbc8b6e48d\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-zqgbl" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.844396 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28d42154-af7b-440b-af1b-2ef50ee9edca-cert\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm\" (UID: \"28d42154-af7b-440b-af1b-2ef50ee9edca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" Oct 05 20:28:18 crc kubenswrapper[4753]: E1005 20:28:18.844729 4753 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 05 20:28:18 crc kubenswrapper[4753]: E1005 20:28:18.844863 4753 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28d42154-af7b-440b-af1b-2ef50ee9edca-cert podName:28d42154-af7b-440b-af1b-2ef50ee9edca nodeName:}" failed. No retries permitted until 2025-10-05 20:28:19.84484856 +0000 UTC m=+808.693176792 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/28d42154-af7b-440b-af1b-2ef50ee9edca-cert") pod "openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" (UID: "28d42154-af7b-440b-af1b-2ef50ee9edca") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.850960 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k49lz\" (UniqueName: \"kubernetes.io/projected/82db7b73-2afb-4063-9d64-fc3fa5559e93-kube-api-access-k49lz\") pod \"telemetry-operator-controller-manager-f589c7597-58qhn\" (UID: \"82db7b73-2afb-4063-9d64-fc3fa5559e93\") " pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.872645 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs"] Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.873499 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-76d5577b-4zhzq" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.905428 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbpq4\" (UniqueName: \"kubernetes.io/projected/96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3-kube-api-access-pbpq4\") pod \"watcher-operator-controller-manager-5d98cc5575-t99zs\" (UID: \"96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3\") " pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.914675 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.924480 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.945799 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdmxt\" (UniqueName: \"kubernetes.io/projected/8dd994e1-cb87-48dc-b844-2bdbc8b6e48d-kube-api-access-tdmxt\") pod \"test-operator-controller-manager-6bb6dcddc-zqgbl\" (UID: \"8dd994e1-cb87-48dc-b844-2bdbc8b6e48d\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-zqgbl" Oct 05 20:28:18 crc kubenswrapper[4753]: I1005 20:28:18.959099 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r"] Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.016498 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r"] Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.016615 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.032727 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdmxt\" (UniqueName: \"kubernetes.io/projected/8dd994e1-cb87-48dc-b844-2bdbc8b6e48d-kube-api-access-tdmxt\") pod \"test-operator-controller-manager-6bb6dcddc-zqgbl\" (UID: \"8dd994e1-cb87-48dc-b844-2bdbc8b6e48d\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-zqgbl" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.033413 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-qqcn8" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.033603 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.066901 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr"] Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.067946 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.080928 4753 generic.go:334] "Generic (PLEG): container finished" podID="81397f28-bb39-4bfc-97dc-ec850c6671a8" containerID="0e90761f9bc938106c267c0fcb09bc4a50d5c207e25cadd037395d2dfb8e93ec" exitCode=0 Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.081114 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bkds" event={"ID":"81397f28-bb39-4bfc-97dc-ec850c6671a8","Type":"ContainerDied","Data":"0e90761f9bc938106c267c0fcb09bc4a50d5c207e25cadd037395d2dfb8e93ec"} Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.086034 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr"] Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.080941 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-dzxzk" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.108505 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-zqgbl" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.158287 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtwsk\" (UniqueName: \"kubernetes.io/projected/2d0279fb-be4d-47a0-83c7-4452c7b13a5b-kube-api-access-jtwsk\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr\" (UID: \"2d0279fb-be4d-47a0-83c7-4452c7b13a5b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.158326 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5-cert\") pod \"openstack-operator-controller-manager-7cfc658b9-4t94r\" (UID: \"d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5\") " pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.158373 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvkxr\" (UniqueName: \"kubernetes.io/projected/d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5-kube-api-access-rvkxr\") pod \"openstack-operator-controller-manager-7cfc658b9-4t94r\" (UID: \"d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5\") " pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.222568 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-4wqxw"] Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.267546 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtwsk\" (UniqueName: \"kubernetes.io/projected/2d0279fb-be4d-47a0-83c7-4452c7b13a5b-kube-api-access-jtwsk\") pod 
\"rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr\" (UID: \"2d0279fb-be4d-47a0-83c7-4452c7b13a5b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.267590 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5-cert\") pod \"openstack-operator-controller-manager-7cfc658b9-4t94r\" (UID: \"d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5\") " pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.267643 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvkxr\" (UniqueName: \"kubernetes.io/projected/d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5-kube-api-access-rvkxr\") pod \"openstack-operator-controller-manager-7cfc658b9-4t94r\" (UID: \"d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5\") " pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" Oct 05 20:28:19 crc kubenswrapper[4753]: E1005 20:28:19.268106 4753 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 05 20:28:19 crc kubenswrapper[4753]: E1005 20:28:19.268164 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5-cert podName:d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5 nodeName:}" failed. No retries permitted until 2025-10-05 20:28:19.768150313 +0000 UTC m=+808.616478545 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5-cert") pod "openstack-operator-controller-manager-7cfc658b9-4t94r" (UID: "d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5") : secret "webhook-server-cert" not found Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.280406 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5b974f6766-wsbjp"] Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.291247 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48"] Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.299799 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvkxr\" (UniqueName: \"kubernetes.io/projected/d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5-kube-api-access-rvkxr\") pod \"openstack-operator-controller-manager-7cfc658b9-4t94r\" (UID: \"d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5\") " pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.301003 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtwsk\" (UniqueName: \"kubernetes.io/projected/2d0279fb-be4d-47a0-83c7-4452c7b13a5b-kube-api-access-jtwsk\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr\" (UID: \"2d0279fb-be4d-47a0-83c7-4452c7b13a5b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.368089 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f241e98d-8f7c-492a-a4bc-988dc78b6449-cert\") pod \"infra-operator-controller-manager-84788b6bc5-vksxs\" (UID: \"f241e98d-8f7c-492a-a4bc-988dc78b6449\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" Oct 
05 20:28:19 crc kubenswrapper[4753]: E1005 20:28:19.368235 4753 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 05 20:28:19 crc kubenswrapper[4753]: E1005 20:28:19.368275 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f241e98d-8f7c-492a-a4bc-988dc78b6449-cert podName:f241e98d-8f7c-492a-a4bc-988dc78b6449 nodeName:}" failed. No retries permitted until 2025-10-05 20:28:21.368262526 +0000 UTC m=+810.216590758 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f241e98d-8f7c-492a-a4bc-988dc78b6449-cert") pod "infra-operator-controller-manager-84788b6bc5-vksxs" (UID: "f241e98d-8f7c-492a-a4bc-988dc78b6449") : secret "infra-operator-webhook-server-cert" not found Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.424389 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr" Oct 05 20:28:19 crc kubenswrapper[4753]: W1005 20:28:19.494254 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9286136d_f0a7_4488_b346_2b3ea3ab81da.slice/crio-4870656622c9236766796260434cad682617500218d90b48f8f2078bfe37374d WatchSource:0}: Error finding container 4870656622c9236766796260434cad682617500218d90b48f8f2078bfe37374d: Status 404 returned error can't find the container with id 4870656622c9236766796260434cad682617500218d90b48f8f2078bfe37374d Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.566598 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b"] Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.797285 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5-cert\") pod \"openstack-operator-controller-manager-7cfc658b9-4t94r\" (UID: \"d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5\") " pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" Oct 05 20:28:19 crc kubenswrapper[4753]: E1005 20:28:19.798278 4753 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 05 20:28:19 crc kubenswrapper[4753]: E1005 20:28:19.798328 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5-cert podName:d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5 nodeName:}" failed. No retries permitted until 2025-10-05 20:28:20.798312089 +0000 UTC m=+809.646640321 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5-cert") pod "openstack-operator-controller-manager-7cfc658b9-4t94r" (UID: "d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5") : secret "webhook-server-cert" not found Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.815496 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.905213 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81397f28-bb39-4bfc-97dc-ec850c6671a8-utilities\") pod \"81397f28-bb39-4bfc-97dc-ec850c6671a8\" (UID: \"81397f28-bb39-4bfc-97dc-ec850c6671a8\") " Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.905299 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81397f28-bb39-4bfc-97dc-ec850c6671a8-catalog-content\") pod \"81397f28-bb39-4bfc-97dc-ec850c6671a8\" (UID: \"81397f28-bb39-4bfc-97dc-ec850c6671a8\") " Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.905354 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nsn8\" (UniqueName: \"kubernetes.io/projected/81397f28-bb39-4bfc-97dc-ec850c6671a8-kube-api-access-8nsn8\") pod \"81397f28-bb39-4bfc-97dc-ec850c6671a8\" (UID: \"81397f28-bb39-4bfc-97dc-ec850c6671a8\") " Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.905591 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28d42154-af7b-440b-af1b-2ef50ee9edca-cert\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm\" (UID: \"28d42154-af7b-440b-af1b-2ef50ee9edca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.913478 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81397f28-bb39-4bfc-97dc-ec850c6671a8-utilities" (OuterVolumeSpecName: "utilities") pod "81397f28-bb39-4bfc-97dc-ec850c6671a8" (UID: "81397f28-bb39-4bfc-97dc-ec850c6671a8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.925721 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28d42154-af7b-440b-af1b-2ef50ee9edca-cert\") pod \"openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm\" (UID: \"28d42154-af7b-440b-af1b-2ef50ee9edca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.942748 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81397f28-bb39-4bfc-97dc-ec850c6671a8-kube-api-access-8nsn8" (OuterVolumeSpecName: "kube-api-access-8nsn8") pod "81397f28-bb39-4bfc-97dc-ec850c6671a8" (UID: "81397f28-bb39-4bfc-97dc-ec850c6671a8"). InnerVolumeSpecName "kube-api-access-8nsn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:28:19 crc kubenswrapper[4753]: I1005 20:28:19.966343 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gqlbb" podUID="92664ac3-02b4-4557-b7d4-2c638b4c082f" containerName="registry-server" probeResult="failure" output=< Oct 05 20:28:19 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 20:28:19 crc kubenswrapper[4753]: > Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.007251 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81397f28-bb39-4bfc-97dc-ec850c6671a8-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.007283 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nsn8\" (UniqueName: \"kubernetes.io/projected/81397f28-bb39-4bfc-97dc-ec850c6671a8-kube-api-access-8nsn8\") on node \"crc\" DevicePath \"\"" Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.029335 4753 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j"] Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.034312 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf"] Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.081950 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.097724 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81397f28-bb39-4bfc-97dc-ec850c6671a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81397f28-bb39-4bfc-97dc-ec850c6671a8" (UID: "81397f28-bb39-4bfc-97dc-ec850c6671a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.099037 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5bkds" Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.099618 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bkds" event={"ID":"81397f28-bb39-4bfc-97dc-ec850c6671a8","Type":"ContainerDied","Data":"5608acc3f803d7c9cad2cbe52e89fcb46e73c5387ee94dfb220fa4875f778d4d"} Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.099651 4753 scope.go:117] "RemoveContainer" containerID="0e90761f9bc938106c267c0fcb09bc4a50d5c207e25cadd037395d2dfb8e93ec" Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.105615 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48" event={"ID":"64896158-a10b-4fd9-b232-5ba3fa647a02","Type":"ContainerStarted","Data":"42442a5f11d4dc016ab44b63b42188a8cc2b7f139419cd29e57566b8e3574544"} Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.108915 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81397f28-bb39-4bfc-97dc-ec850c6671a8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.110022 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf" event={"ID":"dd3487ac-89f8-40f1-967e-71f7fada0fe1","Type":"ContainerStarted","Data":"ea3cdb9c0f41902f463d1b04cccf20afe17366a9d62e784f47f7d8f508c05cd4"} Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.112589 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-4wqxw" event={"ID":"9286136d-f0a7-4488-b346-2b3ea3ab81da","Type":"ContainerStarted","Data":"4870656622c9236766796260434cad682617500218d90b48f8f2078bfe37374d"} Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.113631 4753 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j" event={"ID":"d8c88aaa-c54b-4f65-be07-61e23d5a5cd4","Type":"ContainerStarted","Data":"1ef339fdc09cb41898cc1fa9580267a432cc1d3a0761342e8f3f916da1cb97b1"} Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.116120 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-wsbjp" event={"ID":"12ff014d-81e6-4a9e-8197-e28fbfc4a06e","Type":"ContainerStarted","Data":"909903bee3c0f37a7d539362a6dd3ad4d9293b338d85d213ddfa6339a802e7f1"} Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.118648 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b" event={"ID":"5b8831b7-9250-4ec8-b732-2db04e507cfe","Type":"ContainerStarted","Data":"8311069f3fef871197b9e02b7b9196a9498e735db7b41062fe5cf2826fe286e3"} Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.142756 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5bkds"] Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.152107 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5bkds"] Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.166379 4753 scope.go:117] "RemoveContainer" containerID="9cfcc43cb38e6ef8f9a50a71163b8375a35050db3ea71cc94114cd1b06d2195e" Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.208438 4753 scope.go:117] "RemoveContainer" containerID="e531d7fd60e72d8a444dd23ddc0a261697ff3397136ccd68f930d91b3b96704c" Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.554964 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5"] Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.582953 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ironic-operator-controller-manager-6f5894c49f-ct2l6"] Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.634393 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj"] Oct 05 20:28:20 crc kubenswrapper[4753]: W1005 20:28:20.635249 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c5e8f9b_e10e_436b_ae33_07a7350f02a1.slice/crio-914ff627349889221cacc1aaf2cbd6e8b9173a304d64f236abca812bc190625a WatchSource:0}: Error finding container 914ff627349889221cacc1aaf2cbd6e8b9173a304d64f236abca812bc190625a: Status 404 returned error can't find the container with id 914ff627349889221cacc1aaf2cbd6e8b9173a304d64f236abca812bc190625a Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.644168 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-6c9b57c67-cp4qf"] Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.708626 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd"] Oct 05 20:28:20 crc kubenswrapper[4753]: W1005 20:28:20.743796 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e514d87_9323_4c3b_a372_60e5c65fa731.slice/crio-a9faeaf6e76ac1a5db85bd717f2146edcb5b9449a4f1209733bd973bc5961bd7 WatchSource:0}: Error finding container a9faeaf6e76ac1a5db85bd717f2146edcb5b9449a4f1209733bd973bc5961bd7: Status 404 returned error can't find the container with id a9faeaf6e76ac1a5db85bd717f2146edcb5b9449a4f1209733bd973bc5961bd7 Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.838833 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5-cert\") pod 
\"openstack-operator-controller-manager-7cfc658b9-4t94r\" (UID: \"d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5\") " pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.847848 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5-cert\") pod \"openstack-operator-controller-manager-7cfc658b9-4t94r\" (UID: \"d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5\") " pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.883497 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.940043 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f59f9d8-htstg"] Oct 05 20:28:20 crc kubenswrapper[4753]: I1005 20:28:20.960591 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-c968bb45-xsjn9"] Oct 05 20:28:20 crc kubenswrapper[4753]: W1005 20:28:20.977720 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd58d3fcd_368b_4d73_8c29_a181f3bdddee.slice/crio-4e26a684a5e986d3413b864b3ee58e5ae5efb28ce8715094d9e988b51efe50f0 WatchSource:0}: Error finding container 4e26a684a5e986d3413b864b3ee58e5ae5efb28ce8715094d9e988b51efe50f0: Status 404 returned error can't find the container with id 4e26a684a5e986d3413b864b3ee58e5ae5efb28ce8715094d9e988b51efe50f0 Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.071080 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-zqgbl"] Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 
20:28:21.085190 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-66f6d6849b-9dsbr"] Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.097211 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-rlsgp"] Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.111441 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-4zhzq"] Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.126688 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr"] Oct 05 20:28:21 crc kubenswrapper[4753]: W1005 20:28:21.148894 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00eefbb7_989e_478d_aad3_ff4d236168f2.slice/crio-d722b354b81c0e5c173ab2423b89cf77d8b952e25b64148cf6f6b05a41007075 WatchSource:0}: Error finding container d722b354b81c0e5c173ab2423b89cf77d8b952e25b64148cf6f6b05a41007075: Status 404 returned error can't find the container with id d722b354b81c0e5c173ab2423b89cf77d8b952e25b64148cf6f6b05a41007075 Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.162783 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd" event={"ID":"9c8b9aa1-e15e-475d-a02e-56b430d50bd1","Type":"ContainerStarted","Data":"504d9f995d6882cca8a6ad4894fcac573a1f8f908ae9b1bbf53f4dffa5e88e9f"} Oct 05 20:28:21 crc kubenswrapper[4753]: W1005 20:28:21.169306 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dd994e1_cb87_48dc_b844_2bdbc8b6e48d.slice/crio-dde056025d9082efef58ce5ace12b81f2caa3dabeae5b0c7986c5016f0378c6f WatchSource:0}: Error finding container 
dde056025d9082efef58ce5ace12b81f2caa3dabeae5b0c7986c5016f0378c6f: Status 404 returned error can't find the container with id dde056025d9082efef58ce5ace12b81f2caa3dabeae5b0c7986c5016f0378c6f Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.186113 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj" event={"ID":"885f705b-599d-41fe-92cf-ffd000ad5e6e","Type":"ContainerStarted","Data":"51f30ff4bd1b0e274e99dca7e218c15d49ac2be512d745013b3324af287f903c"} Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.187616 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5" event={"ID":"0c5e8f9b-e10e-436b-ae33-07a7350f02a1","Type":"ContainerStarted","Data":"914ff627349889221cacc1aaf2cbd6e8b9173a304d64f236abca812bc190625a"} Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.189668 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-9dsbr" event={"ID":"ff1a796b-8cc7-4c73-842f-7b4a1170b56f","Type":"ContainerStarted","Data":"038adcd0fad59cfea4a98f4f7b1906f8a75d52b04ae28d9d1e50dc4a555df1e2"} Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.201440 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-cp4qf" event={"ID":"3e514d87-9323-4c3b-a372-60e5c65fa731","Type":"ContainerStarted","Data":"a9faeaf6e76ac1a5db85bd717f2146edcb5b9449a4f1209733bd973bc5961bd7"} Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.211826 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-c968bb45-xsjn9" event={"ID":"8ed6d37f-576a-4f14-a98a-65193559d7de","Type":"ContainerStarted","Data":"fb6bbd62452cb41758237306c0bbf70609ce8e855f9703ee05b0d62a3b30ee93"} Oct 05 20:28:21 crc kubenswrapper[4753]: W1005 20:28:21.225236 4753 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod995eda80_87fa_4160_b04e_679668f8d910.slice/crio-083b183ed2170ddedfee780e74bfc16b64e6cb50f9b6e5d8ca02d950e25dbdf5 WatchSource:0}: Error finding container 083b183ed2170ddedfee780e74bfc16b64e6cb50f9b6e5d8ca02d950e25dbdf5: Status 404 returned error can't find the container with id 083b183ed2170ddedfee780e74bfc16b64e6cb50f9b6e5d8ca02d950e25dbdf5 Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.225388 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-ct2l6" event={"ID":"266b0921-1164-46bc-9e78-986f5ded5943","Type":"ContainerStarted","Data":"2f84180751cd0c075c67d3b7d879ab0fafe01cd3675db8e110bb830b60f745e5"} Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.248217 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm"] Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.258296 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-htstg" event={"ID":"d58d3fcd-368b-4d73-8c29-a181f3bdddee","Type":"ContainerStarted","Data":"4e26a684a5e986d3413b864b3ee58e5ae5efb28ce8715094d9e988b51efe50f0"} Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.262663 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs"] Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.299704 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn"] Oct 05 20:28:21 crc kubenswrapper[4753]: E1005 20:28:21.359851 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:d1fad97d2cd602a4f7b6fd6c202464ac117b20e6608c17aa04cadbceb78a498d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:6722a752fb7cbffbae811f6ad6567120fbd4ebbe8c38a83ec2df02850a3276bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:c80fca8178ade7c7eaf9466a74a7cd7e904a7699590ea332acf0ac7bd90e647a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:97cfb1305d59fe73caaaf9d261d08ee16ef3f50e0f3a07fec40fa5c93e5b5190,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:39c22d1455b7278118c6831541db7741712dcc7106a78e8a86ba7e5bcb3a9f23,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:ef02cdf1a68a8cc6ee02f7dcd7e0ca11c828e027069e2e2d6d49c2caa7e6cd70,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_
IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:7db347424a8c5998059c5bf84c86a1ef8d582d1ffe39f4887551f2ac85a4915f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener@sha256:b9a6a9d3bec948e74b8275ada997d8ce394536ad3e2e0fef38ba2f6d643ee560,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:1ee11bb6061765cfedb3c49395f5fb3c0ec537c15e647766fd5f8e9625e587ac,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:0e5405cc4eea290c668815f6ae25b41233c371001037440e0e2752d40aecd438,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:7378049e4fc2b54c20806f9633e831f88d649442facbaf7fa20da005b20da68b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:cb1205166ac37c39e585514884cd7e474b6ec15744290e50c858f2336613532d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:55d94c0ffc3092966f7f5e0539790112ee027faa0e0b7f29972be747dea42e6a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openst
ack-cinder-api@sha256:b0c0824763cbfd23b836ee4355015d9f94daa115bcc9ef0ea8b8e8980d5a6213,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:d61ae6cef3d31df7dfa7375cf5d4975c76ef7ec2e306267f9cfcb212846d15a5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:f90535885474f3b5e262f1ce87c7d3724d8b66f83fc9fbd646c2f0914e1f5597,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:307905acc0883211e04209e4e4692a8bae080e4a3a168bffe4c06ec5f73ebc76,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:f8a2d3c173473e08b57b2741614d8f1ef0f469b07038d85a30687e1b6c6ad32f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:32cae89e17a4562d52b7c28e9f286c4741cef602d042e6bba987c535a17c01cc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:1ed863535bbbadc6a3de00ea5fb4ed5a44b22fa475b0001506327b23b8974f16,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:4ac7d4dac6af6bab407e91d373d93fedd735464e08e72c293eb5ba69210c2e2e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:67e705d98cd50f81c948686b39ecdca6883f0d822e03a697e573111acbc47395,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:36cd868f24a1d5d108841eca240cfd6592792f0ad626a
ddd6cf79a79e638ce62,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:da8363d88d3c5ccdd18903977a44e4ef5c315adbb462c2a8987d4260405f9427,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:cf91f2bb1627b4449e6016194b52182cce38432d0549dcd62adbb0026ccfbfb5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:95875e2d73a4ee08c4f6d092dab04dc167cc3c8b156f29acf0dcde8e220b6672,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:c3bf86219494451daaee9dd376e852a4b9994f301d9bf76648b48e5dfc54625f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:2e3729d3462bc8da3bf09d4176fda3bba9b5ca167201cb108b862dd5fb5a8c67,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent@sha256:0e78c6c4513b82ee20c1fbf3926ff88e3e0e23305716f89cadd25c885a57dc8f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:2d999c2b47273e04dfbacda96d56cd6d32e8ea1c4e8d6f5c3211e9ff06be69e0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:1505d142ca95c7fd6b9a8eec753423ad6c49c98d71fb5c503e938dec10cecb05,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL
_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:7e0e75413d684196317f10b3920f576def3a9a2ffe9e3c24de838562836e983e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:0da1f2f9accb91d893fdb3273c6e66481c8d00b84b262c8cd772a6aaa5ecd1c4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:b2f6e417260d4ae6bf70c05bf2243a0b6cac555cd28080772546761015674191,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:2517c11de03cac2a402dff8575a12f7feec80ad815a83a8a286d8de9d8a5ef9f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:14a280dd80040c348605adaff57e7a66371f8be9cec2d9fcf5101cdae20b2cb7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:af91e545665b439e95fb4bb5d33cc8c5b9e8e38c7cdd35f412e32dc30a4c7c7e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:656a6af048f632a755017d320b68a6e8215179253c9530848e1d6a4f9594fff8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:59a967078b7538e3b0d21b2c23a7d9abe69b75f0dccbeb89777f2f6a91c46170,ValueFrom:nil,},Env
Var{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:6a0403113cf843d54521055ae34d368f350a79adc66a31da34cfbad7517092ee,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api@sha256:92c22647ba003cacb23ec491fddfedb346ffabc733e85169bc9ffce4598a35f5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:a4a766bfc1ebcb531717646ef6a7f608c7ee58c883c58c37f1ec80d5062bd672,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:d0a4c0ba8b21503531f6658909a4791db89d34c492b2d0a148e1908d4b0fd835,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent@sha256:2af885158034339e8d0db2f58d2b27aa822df05b0f7a9c43225e7346b8a0aeeb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:eeb788d6769b1cc1bb421db442cfa457fb575be51440ffe37f39ded03e8b911a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:8a111f93924ba33c7c5baaf8058470208e55eb0b113241539fad9ddb0e4ee916,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:9204104284dfd12ee72833529e8f4244e340e5443cdf2ec5ff25fd5e7b89169c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:352c
ef8108e0f387a8a1b4d4eea464d094ea5eca7b81543b5e1baeb2d5ecca0d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:374b9dc69979de9ffcec44f060dec37b126d02e12b303177ff1904b4fe79c331,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:014f3c73f78e17359b947bf7b5c38284afa6fa123598f96dc3a9b9d480947827,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:1ae2548f40ab4498166685f922b61946708d1204fd792c09d4256c7a5c86121e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:9ba5c3c71e2f86549334c46e4c411e3a612b8ecc23865c9be55341dcb4c40568,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:053c95cc75e5bc6de83a08f3196125bb5fbbfea1795643daf3f1378cbaad5d26,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:26f9f12887eed6bb003c56049af1e56a7da8f8845d78dfaf3d5e9278b5815a30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:29ece1e157dcc25b267e79b7863d35ac453d7751fee005f136beb320a2a888bb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:82a8fbd0df08c8e89109c4aa251f5ffa8055592d416874cdd65f3bff90c06ecf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:5b4c0ddc3f4f06193490b9290007ce0cd098ce0e7ec16c3ccb2759c12d683e51,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Val
ue:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:027992dcc115054342aed3fcb65fa94667288d0e443e93bddddc0ceec908da20,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api@sha256:6e7fc684bf020d3552774ed9a2033634e54e0c946741500f3b93c83a0ede01cd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager@sha256:4a57e6c31fc2727f0cc09d5ee3832779d43404a4237c2328fb2faf9e3a6d0e50,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:432c5410505a3089d1d5ec3f46b39e551970763b7e2f0655f617d9cbad68e138,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:0226892f8e96ba418dd5adcae8d459a5e069b938e7c7abc37335271f9b3b990d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:d6def164f37ae70f9b26399b679048582f4d357fb7a130af69947b63ba68e829,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:5670f9e696b19b76695bea5d4f9c46ac6494d96282f094de1243d8d7a06453b2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:943eee724277e252795909137538a553ef5284c8103ad01b9be7b0138c66d14d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:948b381cd
228ded38b6e51b2f297c70f7033a227b565ee8b12490ac48816d4f8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:db16796e99a5906da65eb9068447771f2a11bb3cd6a13cbc2b9c94205ca114a1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:f62e51af159d4420615c3c9045a540af6c3580389e49e48ea9297d748fc09a9b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:9182307b83cce6c08fd9da227d93f1b94e3840ba5c645776d64b7d6e59333ddf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:942c5542e9e5cb8122f7614dd1e8f34badfcc94ed56cc3cf68cfae744392a290,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:45e561fbabeefac4e9f5da933b26bf960576908daf3b09364ac7890b370d0ccc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:d4062383c3b55d604b842067a258b63caf51dcea7d26447a53cd681105626951,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:d142e6e9123b2a4f16d1d65c5d11a132ae9755c2c8cf429ca7ef8c9cd00f4f42,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:b73bcff68993217b4bb909a1fc0191df006be2e4eef48d9bc65a2e3cb0adba0c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container@sha256:a9f130c7e66a99c8c29b1d8795310ffd4dfa6eb18df3484b87cbcfed9f285406,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJE
CT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object@sha256:751bc25410e670688b7691763b11a500ed90b8d6dbb84e682cba0db34e743dd4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:5af48acd9260f889cdcbeb2d43cd83aa6a7f3c12b0a9f0d3cedf43e98aed60d6,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:3bfbd3e9b524a5152ebd5a70d0412b0f2b6c8b2143b30e49ed4528f651700fdf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:4620884099776ce989a689510bcb41ac167d884292e30d31d4d89d2b08b3c0be,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:65a4ba0d6ebb973b3f0fec8bf2acd2cf99862c70b7f499f8507434184533632d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:0637803c6ceebcf9093b3c8f679f9a5a5be77ea52f530b8c52ba830168433fc2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2flq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm_openstack-operators(28d42154-af7b-440b-af1b-2ef50ee9edca): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 05 20:28:21 crc kubenswrapper[4753]: E1005 20:28:21.371722 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k49lz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
telemetry-operator-controller-manager-f589c7597-58qhn_openstack-operators(82db7b73-2afb-4063-9d64-fc3fa5559e93): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.453106 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f241e98d-8f7c-492a-a4bc-988dc78b6449-cert\") pod \"infra-operator-controller-manager-84788b6bc5-vksxs\" (UID: \"f241e98d-8f7c-492a-a4bc-988dc78b6449\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.460575 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f241e98d-8f7c-492a-a4bc-988dc78b6449-cert\") pod \"infra-operator-controller-manager-84788b6bc5-vksxs\" (UID: \"f241e98d-8f7c-492a-a4bc-988dc78b6449\") " pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.510266 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" Oct 05 20:28:21 crc kubenswrapper[4753]: E1005 20:28:21.836044 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn" podUID="82db7b73-2afb-4063-9d64-fc3fa5559e93" Oct 05 20:28:21 crc kubenswrapper[4753]: E1005 20:28:21.851864 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" podUID="28d42154-af7b-440b-af1b-2ef50ee9edca" Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.891345 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81397f28-bb39-4bfc-97dc-ec850c6671a8" path="/var/lib/kubelet/pods/81397f28-bb39-4bfc-97dc-ec850c6671a8/volumes" Oct 05 20:28:21 crc kubenswrapper[4753]: I1005 20:28:21.915307 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r"] Oct 05 20:28:22 crc kubenswrapper[4753]: W1005 20:28:22.038575 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8bad9ab_af81_4b7c_bf9a_cf1fa60f9be5.slice/crio-cfe3571c7ba33690ae03e1492f4e3ac503e81e076c36404ecc666b841b73a935 WatchSource:0}: Error finding container cfe3571c7ba33690ae03e1492f4e3ac503e81e076c36404ecc666b841b73a935: Status 404 returned error can't find the container with id cfe3571c7ba33690ae03e1492f4e3ac503e81e076c36404ecc666b841b73a935 Oct 05 20:28:22 crc kubenswrapper[4753]: I1005 20:28:22.190854 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs"] Oct 05 20:28:22 crc 
kubenswrapper[4753]: W1005 20:28:22.252372 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf241e98d_8f7c_492a_a4bc_988dc78b6449.slice/crio-67d03c0c8b4e6ead0943a6718f3782463fcfe21d5da94bdb3365b080abd4c685 WatchSource:0}: Error finding container 67d03c0c8b4e6ead0943a6718f3782463fcfe21d5da94bdb3365b080abd4c685: Status 404 returned error can't find the container with id 67d03c0c8b4e6ead0943a6718f3782463fcfe21d5da94bdb3365b080abd4c685 Oct 05 20:28:22 crc kubenswrapper[4753]: I1005 20:28:22.288944 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs" event={"ID":"96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3","Type":"ContainerStarted","Data":"3bdc26931b63a796f8f29add5bea0d46fee7c526cdccbc67ef02456d85467572"} Oct 05 20:28:22 crc kubenswrapper[4753]: I1005 20:28:22.301797 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn" event={"ID":"82db7b73-2afb-4063-9d64-fc3fa5559e93","Type":"ContainerStarted","Data":"3977839e4d835307a559bb26499fb6f329930a2556064ed7e09a2e41c5fc29b8"} Oct 05 20:28:22 crc kubenswrapper[4753]: I1005 20:28:22.301842 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn" event={"ID":"82db7b73-2afb-4063-9d64-fc3fa5559e93","Type":"ContainerStarted","Data":"ad50f3d3b48260b768751e538753af8b691fd625640dfe64ebb70adc7e4a1568"} Oct 05 20:28:22 crc kubenswrapper[4753]: I1005 20:28:22.303592 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-rlsgp" event={"ID":"00eefbb7-989e-478d-aad3-ff4d236168f2","Type":"ContainerStarted","Data":"d722b354b81c0e5c173ab2423b89cf77d8b952e25b64148cf6f6b05a41007075"} Oct 05 20:28:22 crc kubenswrapper[4753]: E1005 20:28:22.303894 4753 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn" podUID="82db7b73-2afb-4063-9d64-fc3fa5559e93" Oct 05 20:28:22 crc kubenswrapper[4753]: I1005 20:28:22.305506 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr" event={"ID":"2d0279fb-be4d-47a0-83c7-4452c7b13a5b","Type":"ContainerStarted","Data":"2ece43c27311d84b86a5f4610ddebd8590696548dce3f95e68b09aa59cbc7ebc"} Oct 05 20:28:22 crc kubenswrapper[4753]: I1005 20:28:22.312672 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-zqgbl" event={"ID":"8dd994e1-cb87-48dc-b844-2bdbc8b6e48d","Type":"ContainerStarted","Data":"dde056025d9082efef58ce5ace12b81f2caa3dabeae5b0c7986c5016f0378c6f"} Oct 05 20:28:22 crc kubenswrapper[4753]: I1005 20:28:22.314193 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" event={"ID":"d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5","Type":"ContainerStarted","Data":"cfe3571c7ba33690ae03e1492f4e3ac503e81e076c36404ecc666b841b73a935"} Oct 05 20:28:22 crc kubenswrapper[4753]: I1005 20:28:22.348925 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-4zhzq" event={"ID":"995eda80-87fa-4160-b04e-679668f8d910","Type":"ContainerStarted","Data":"083b183ed2170ddedfee780e74bfc16b64e6cb50f9b6e5d8ca02d950e25dbdf5"} Oct 05 20:28:22 crc kubenswrapper[4753]: I1005 20:28:22.382465 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" event={"ID":"28d42154-af7b-440b-af1b-2ef50ee9edca","Type":"ContainerStarted","Data":"fa3c95581a198a17f9cd81175c6bacab075edafdd40d7ef7537e3f2a5ddf7566"} Oct 05 20:28:22 crc kubenswrapper[4753]: I1005 20:28:22.382521 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" event={"ID":"28d42154-af7b-440b-af1b-2ef50ee9edca","Type":"ContainerStarted","Data":"07b653753b2537f3ce944c32af0c416b947454b057e2e0e2e21b27ee33bbc736"} Oct 05 20:28:22 crc kubenswrapper[4753]: E1005 20:28:22.389178 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" podUID="28d42154-af7b-440b-af1b-2ef50ee9edca" Oct 05 20:28:23 crc kubenswrapper[4753]: I1005 20:28:23.424605 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" event={"ID":"d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5","Type":"ContainerStarted","Data":"5df735bb04c1c79a3057f7bcab7cd6e25d6f9c58bd89f8267dbd83b4a61ad9dc"} Oct 05 20:28:23 crc kubenswrapper[4753]: I1005 20:28:23.425476 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" event={"ID":"d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5","Type":"ContainerStarted","Data":"72a29ba0651ff3650d691bb514fd35c418317696b63f5c17c23e891c45595b4d"} Oct 05 20:28:23 crc kubenswrapper[4753]: I1005 20:28:23.425712 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" Oct 05 20:28:23 crc kubenswrapper[4753]: I1005 20:28:23.433645 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" event={"ID":"f241e98d-8f7c-492a-a4bc-988dc78b6449","Type":"ContainerStarted","Data":"67d03c0c8b4e6ead0943a6718f3782463fcfe21d5da94bdb3365b080abd4c685"} Oct 05 20:28:23 crc kubenswrapper[4753]: E1005 20:28:23.439730 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn" podUID="82db7b73-2afb-4063-9d64-fc3fa5559e93" Oct 05 20:28:23 crc kubenswrapper[4753]: E1005 20:28:23.440644 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" podUID="28d42154-af7b-440b-af1b-2ef50ee9edca" Oct 05 20:28:23 crc kubenswrapper[4753]: I1005 20:28:23.468786 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" podStartSLOduration=5.468767557 podStartE2EDuration="5.468767557s" podCreationTimestamp="2025-10-05 20:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:28:23.465791425 +0000 UTC m=+812.314119657" watchObservedRunningTime="2025-10-05 20:28:23.468767557 +0000 UTC 
m=+812.317095789" Oct 05 20:28:28 crc kubenswrapper[4753]: I1005 20:28:28.716238 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:28 crc kubenswrapper[4753]: I1005 20:28:28.783076 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:28 crc kubenswrapper[4753]: I1005 20:28:28.951547 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqlbb"] Oct 05 20:28:30 crc kubenswrapper[4753]: I1005 20:28:30.503974 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gqlbb" podUID="92664ac3-02b4-4557-b7d4-2c638b4c082f" containerName="registry-server" containerID="cri-o://8c8380e9d964bcce409109dbe53f2348907c854c75248ae2c343ab887aebaca2" gracePeriod=2 Oct 05 20:28:30 crc kubenswrapper[4753]: I1005 20:28:30.889606 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7cfc658b9-4t94r" Oct 05 20:28:31 crc kubenswrapper[4753]: I1005 20:28:31.513486 4753 generic.go:334] "Generic (PLEG): container finished" podID="92664ac3-02b4-4557-b7d4-2c638b4c082f" containerID="8c8380e9d964bcce409109dbe53f2348907c854c75248ae2c343ab887aebaca2" exitCode=0 Oct 05 20:28:31 crc kubenswrapper[4753]: I1005 20:28:31.513524 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqlbb" event={"ID":"92664ac3-02b4-4557-b7d4-2c638b4c082f","Type":"ContainerDied","Data":"8c8380e9d964bcce409109dbe53f2348907c854c75248ae2c343ab887aebaca2"} Oct 05 20:28:35 crc kubenswrapper[4753]: E1005 20:28:35.658954 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/heat-operator@sha256:0cefa320e45c741f8bffea583eeb6cf7465c4e0a183ae51614bf4b7677ffcb55" Oct 05 20:28:35 crc kubenswrapper[4753]: E1005 20:28:35.660228 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:0cefa320e45c741f8bffea583eeb6cf7465c4e0a183ae51614bf4b7677ffcb55,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t5sfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5c497dbdb-txd4b_openstack-operators(5b8831b7-9250-4ec8-b732-2db04e507cfe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:28:36 crc kubenswrapper[4753]: E1005 20:28:36.136246 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1" Oct 05 20:28:36 crc kubenswrapper[4753]: E1005 20:28:36.136412 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hvh8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-operator-controller-manager-66f6d6849b-9dsbr_openstack-operators(ff1a796b-8cc7-4c73-842f-7b4a1170b56f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:28:36 crc kubenswrapper[4753]: E1005 20:28:36.573668 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757" Oct 05 20:28:36 crc kubenswrapper[4753]: E1005 20:28:36.573862 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xfzkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7cb48dbc-f4dp5_openstack-operators(0c5e8f9b-e10e-436b-ae33-07a7350f02a1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:28:38 crc kubenswrapper[4753]: E1005 20:28:38.193404 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862" Oct 05 20:28:38 crc kubenswrapper[4753]: E1005 20:28:38.194016 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sgnqn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
neutron-operator-controller-manager-69b956fbf6-vtgfd_openstack-operators(9c8b9aa1-e15e-475d-a02e-56b430d50bd1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:28:38 crc kubenswrapper[4753]: E1005 20:28:38.633707 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:4cba007c18be1ec9aac2ece7a5ce6444a94afd89f0fb032522811d5bdf5bee73" Oct 05 20:28:38 crc kubenswrapper[4753]: E1005 20:28:38.633906 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:4cba007c18be1ec9aac2ece7a5ce6444a94afd89f0fb032522811d5bdf5bee73,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8h6gf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-6675647785-dqfcj_openstack-operators(885f705b-599d-41fe-92cf-ffd000ad5e6e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:28:38 crc kubenswrapper[4753]: E1005 20:28:38.658468 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c8380e9d964bcce409109dbe53f2348907c854c75248ae2c343ab887aebaca2 is running failed: container process not found" containerID="8c8380e9d964bcce409109dbe53f2348907c854c75248ae2c343ab887aebaca2" cmd=["grpc_health_probe","-addr=:50051"] Oct 05 20:28:38 crc kubenswrapper[4753]: E1005 20:28:38.659018 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID 
of 8c8380e9d964bcce409109dbe53f2348907c854c75248ae2c343ab887aebaca2 is running failed: container process not found" containerID="8c8380e9d964bcce409109dbe53f2348907c854c75248ae2c343ab887aebaca2" cmd=["grpc_health_probe","-addr=:50051"] Oct 05 20:28:38 crc kubenswrapper[4753]: E1005 20:28:38.659265 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c8380e9d964bcce409109dbe53f2348907c854c75248ae2c343ab887aebaca2 is running failed: container process not found" containerID="8c8380e9d964bcce409109dbe53f2348907c854c75248ae2c343ab887aebaca2" cmd=["grpc_health_probe","-addr=:50051"] Oct 05 20:28:38 crc kubenswrapper[4753]: E1005 20:28:38.659298 4753 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c8380e9d964bcce409109dbe53f2348907c854c75248ae2c343ab887aebaca2 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-gqlbb" podUID="92664ac3-02b4-4557-b7d4-2c638b4c082f" containerName="registry-server" Oct 05 20:28:39 crc kubenswrapper[4753]: E1005 20:28:39.509425 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:445a1332c0eaaa21a5459d3ffe56a8696a6a61131c39dc7bb47571b251a30830" Oct 05 20:28:39 crc kubenswrapper[4753]: E1005 20:28:39.509657 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:445a1332c0eaaa21a5459d3ffe56a8696a6a61131c39dc7bb47571b251a30830,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2px58,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
cinder-operator-controller-manager-84bd8f6848-p5g48_openstack-operators(64896158-a10b-4fd9-b232-5ba3fa647a02): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:28:40 crc kubenswrapper[4753]: E1005 20:28:40.030594 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Oct 05 20:28:40 crc kubenswrapper[4753]: E1005 20:28:40.038075 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jtwsk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr_openstack-operators(2d0279fb-be4d-47a0-83c7-4452c7b13a5b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:28:40 crc kubenswrapper[4753]: E1005 20:28:40.039286 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr" podUID="2d0279fb-be4d-47a0-83c7-4452c7b13a5b" Oct 05 20:28:40 crc kubenswrapper[4753]: E1005 20:28:40.567356 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr" podUID="2d0279fb-be4d-47a0-83c7-4452c7b13a5b" Oct 05 20:28:41 crc 
kubenswrapper[4753]: E1005 20:28:41.056899 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:354a1057bb423082aeda16c0209381a05266e90e30e216522c1462be7d4c4610" Oct 05 20:28:41 crc kubenswrapper[4753]: E1005 20:28:41.057213 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:354a1057bb423082aeda16c0209381a05266e90e30e216522c1462be7d4c4610,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gkkth,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-698456cdc6-tnt6j_openstack-operators(d8c88aaa-c54b-4f65-be07-61e23d5a5cd4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:28:41 crc kubenswrapper[4753]: E1005 20:28:41.507069 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6" Oct 05 20:28:41 crc kubenswrapper[4753]: E1005 20:28:41.507269 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pbpq4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
watcher-operator-controller-manager-5d98cc5575-t99zs_openstack-operators(96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:28:42 crc kubenswrapper[4753]: E1005 20:28:42.608935 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:daed26a1dce4b8221159a011cbc5905ef63e9887fdca2118e4bcc61f88b5fb0a" Oct 05 20:28:42 crc kubenswrapper[4753]: E1005 20:28:42.609441 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:daed26a1dce4b8221159a011cbc5905ef63e9887fdca2118e4bcc61f88b5fb0a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qlp7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-57c9cdcf57-9kjpf_openstack-operators(dd3487ac-89f8-40f1-967e-71f7fada0fe1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:28:43 crc kubenswrapper[4753]: I1005 20:28:43.697895 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:43 crc kubenswrapper[4753]: I1005 20:28:43.757212 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92664ac3-02b4-4557-b7d4-2c638b4c082f-utilities\") pod \"92664ac3-02b4-4557-b7d4-2c638b4c082f\" (UID: \"92664ac3-02b4-4557-b7d4-2c638b4c082f\") " Oct 05 20:28:43 crc kubenswrapper[4753]: I1005 20:28:43.757270 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92664ac3-02b4-4557-b7d4-2c638b4c082f-catalog-content\") pod \"92664ac3-02b4-4557-b7d4-2c638b4c082f\" (UID: \"92664ac3-02b4-4557-b7d4-2c638b4c082f\") " Oct 05 20:28:43 crc kubenswrapper[4753]: I1005 20:28:43.757293 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bsvx\" (UniqueName: \"kubernetes.io/projected/92664ac3-02b4-4557-b7d4-2c638b4c082f-kube-api-access-8bsvx\") pod \"92664ac3-02b4-4557-b7d4-2c638b4c082f\" (UID: \"92664ac3-02b4-4557-b7d4-2c638b4c082f\") " Oct 05 20:28:43 crc kubenswrapper[4753]: I1005 20:28:43.760730 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92664ac3-02b4-4557-b7d4-2c638b4c082f-utilities" (OuterVolumeSpecName: "utilities") pod "92664ac3-02b4-4557-b7d4-2c638b4c082f" (UID: "92664ac3-02b4-4557-b7d4-2c638b4c082f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:28:43 crc kubenswrapper[4753]: I1005 20:28:43.769508 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92664ac3-02b4-4557-b7d4-2c638b4c082f-kube-api-access-8bsvx" (OuterVolumeSpecName: "kube-api-access-8bsvx") pod "92664ac3-02b4-4557-b7d4-2c638b4c082f" (UID: "92664ac3-02b4-4557-b7d4-2c638b4c082f"). InnerVolumeSpecName "kube-api-access-8bsvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:28:43 crc kubenswrapper[4753]: I1005 20:28:43.853311 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92664ac3-02b4-4557-b7d4-2c638b4c082f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92664ac3-02b4-4557-b7d4-2c638b4c082f" (UID: "92664ac3-02b4-4557-b7d4-2c638b4c082f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:28:43 crc kubenswrapper[4753]: I1005 20:28:43.858990 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92664ac3-02b4-4557-b7d4-2c638b4c082f-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:28:43 crc kubenswrapper[4753]: I1005 20:28:43.859029 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92664ac3-02b4-4557-b7d4-2c638b4c082f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:28:43 crc kubenswrapper[4753]: I1005 20:28:43.859068 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bsvx\" (UniqueName: \"kubernetes.io/projected/92664ac3-02b4-4557-b7d4-2c638b4c082f-kube-api-access-8bsvx\") on node \"crc\" DevicePath \"\"" Oct 05 20:28:43 crc kubenswrapper[4753]: E1005 20:28:43.939178 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5" podUID="0c5e8f9b-e10e-436b-ae33-07a7350f02a1" Oct 05 20:28:43 crc kubenswrapper[4753]: E1005 20:28:43.945698 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b" podUID="5b8831b7-9250-4ec8-b732-2db04e507cfe" Oct 05 20:28:44 crc kubenswrapper[4753]: E1005 20:28:44.179654 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48" podUID="64896158-a10b-4fd9-b232-5ba3fa647a02" Oct 05 20:28:44 crc kubenswrapper[4753]: E1005 20:28:44.199803 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs" podUID="96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3" Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.588189 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-4zhzq" event={"ID":"995eda80-87fa-4160-b04e-679668f8d910","Type":"ContainerStarted","Data":"87ad653c8329cdaa4db169b9cf2a1d073bbbab6316b8c14cd06f11db3da3e7d7"} Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.592694 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" event={"ID":"28d42154-af7b-440b-af1b-2ef50ee9edca","Type":"ContainerStarted","Data":"008532a64de6cc3da901166e7abcb4d61f41de6acd33bdeaf16d8714494039ab"} Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.593660 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.603464 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqlbb" 
event={"ID":"92664ac3-02b4-4557-b7d4-2c638b4c082f","Type":"ContainerDied","Data":"31150ad5f732415c8d7ddefae1c8070b806eff0bfda4fc414c4efaac4c78363f"} Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.603510 4753 scope.go:117] "RemoveContainer" containerID="8c8380e9d964bcce409109dbe53f2348907c854c75248ae2c343ab887aebaca2" Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.603629 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqlbb" Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.615131 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" event={"ID":"f241e98d-8f7c-492a-a4bc-988dc78b6449","Type":"ContainerStarted","Data":"dbbfcd55dac7bffe56e357bafc8ea0275ddf9476c7f05341b6f98a7b5427ce50"} Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.625097 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48" event={"ID":"64896158-a10b-4fd9-b232-5ba3fa647a02","Type":"ContainerStarted","Data":"ec9e69062aa7a05bd790ed490b52907d06de56b20470e842f665ba6bade58249"} Oct 05 20:28:44 crc kubenswrapper[4753]: E1005 20:28:44.629296 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:445a1332c0eaaa21a5459d3ffe56a8696a6a61131c39dc7bb47571b251a30830\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48" podUID="64896158-a10b-4fd9-b232-5ba3fa647a02" Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.631391 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs" 
event={"ID":"96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3","Type":"ContainerStarted","Data":"29a9a5dce53191f62126d425e01c7e80ef0a0bba961e3dc7463f42f40ad5cfe4"} Oct 05 20:28:44 crc kubenswrapper[4753]: E1005 20:28:44.632860 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs" podUID="96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3" Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.639321 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" podStartSLOduration=5.299753897 podStartE2EDuration="27.639303223s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:21.359284171 +0000 UTC m=+810.207612403" lastFinishedPulling="2025-10-05 20:28:43.698833497 +0000 UTC m=+832.547161729" observedRunningTime="2025-10-05 20:28:44.628952071 +0000 UTC m=+833.477280303" watchObservedRunningTime="2025-10-05 20:28:44.639303223 +0000 UTC m=+833.487631455" Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.646623 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b" event={"ID":"5b8831b7-9250-4ec8-b732-2db04e507cfe","Type":"ContainerStarted","Data":"02721de68c979f13c7fdf20374c2cc7db1a136c6ece7f412a583cb18332d4e27"} Oct 05 20:28:44 crc kubenswrapper[4753]: E1005 20:28:44.647969 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:0cefa320e45c741f8bffea583eeb6cf7465c4e0a183ae51614bf4b7677ffcb55\\\"\"" 
pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b" podUID="5b8831b7-9250-4ec8-b732-2db04e507cfe" Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.670896 4753 scope.go:117] "RemoveContainer" containerID="6c3a564d01e04e96354566e026e93644c270a8bf38a3aaa827d07bf101383fd5" Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.675793 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-rlsgp" event={"ID":"00eefbb7-989e-478d-aad3-ff4d236168f2","Type":"ContainerStarted","Data":"e3f697b1ab8e2d875261a7a799d16eef6e3b7e027eca29ab8a2b057b2c6d2e53"} Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.698065 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqlbb"] Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.698324 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-cp4qf" event={"ID":"3e514d87-9323-4c3b-a372-60e5c65fa731","Type":"ContainerStarted","Data":"dae75cada7988f74b4ff1abd4fe35d46b45280d716260126b14ef0a8d5fa20e9"} Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.710598 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gqlbb"] Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.722037 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-c968bb45-xsjn9" event={"ID":"8ed6d37f-576a-4f14-a98a-65193559d7de","Type":"ContainerStarted","Data":"a0aa09dc4a06efd8a33341a1ac56879f4bc2dc366f20b8e998e93285b48f3146"} Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.729124 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-ct2l6" 
event={"ID":"266b0921-1164-46bc-9e78-986f5ded5943","Type":"ContainerStarted","Data":"a4727dc7692e3e930bd0d6c36829846260854a1248ba989d264cbcaaef902434"} Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.730429 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5" event={"ID":"0c5e8f9b-e10e-436b-ae33-07a7350f02a1","Type":"ContainerStarted","Data":"60d69f3650d15db8cea1db5564a58ae5face44d1659e5f8c2e394eca76370521"} Oct 05 20:28:44 crc kubenswrapper[4753]: E1005 20:28:44.731562 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5" podUID="0c5e8f9b-e10e-436b-ae33-07a7350f02a1" Oct 05 20:28:44 crc kubenswrapper[4753]: I1005 20:28:44.773284 4753 scope.go:117] "RemoveContainer" containerID="d4a1491d19705c8430c191101840116073c137fa678930cd6f4777933c65f0f8" Oct 05 20:28:44 crc kubenswrapper[4753]: E1005 20:28:44.934708 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf" podUID="dd3487ac-89f8-40f1-967e-71f7fada0fe1" Oct 05 20:28:45 crc kubenswrapper[4753]: E1005 20:28:45.035793 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj" podUID="885f705b-599d-41fe-92cf-ffd000ad5e6e" Oct 05 20:28:45 crc kubenswrapper[4753]: E1005 20:28:45.055357 4753 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd" podUID="9c8b9aa1-e15e-475d-a02e-56b430d50bd1" Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.737202 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-zqgbl" event={"ID":"8dd994e1-cb87-48dc-b844-2bdbc8b6e48d","Type":"ContainerStarted","Data":"b84e00883388d1c2e4f95bb146d6e4d08af0ba4d18da6a8de3fe055a866c6c98"} Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.738707 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf" event={"ID":"dd3487ac-89f8-40f1-967e-71f7fada0fe1","Type":"ContainerStarted","Data":"ae490f1142dc8190a07146d60743c5a2fb0e91570c827e3c0c325d5cee2380b3"} Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.741251 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-4wqxw" event={"ID":"9286136d-f0a7-4488-b346-2b3ea3ab81da","Type":"ContainerStarted","Data":"74c708e8e594f7e8519de22d048a0acbefa9666b1beaeb3ad2d903f7285cecdc"} Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.742539 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-wsbjp" event={"ID":"12ff014d-81e6-4a9e-8197-e28fbfc4a06e","Type":"ContainerStarted","Data":"39db91c002c69c61d36a0c964a8c01867dfc042e71f0e9876d954b06943b7af9"} Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.743957 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd" 
event={"ID":"9c8b9aa1-e15e-475d-a02e-56b430d50bd1","Type":"ContainerStarted","Data":"1bd0414eba83582d746f2ee8eedbf36682865a61073f721b37d338670522efd5"} Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.746743 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn" event={"ID":"82db7b73-2afb-4063-9d64-fc3fa5559e93","Type":"ContainerStarted","Data":"486d661a10667160ef270b997fbee472a78020827a752be9dec7bb7305dfe8d4"} Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.746919 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn" Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.752328 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-9dsbr" event={"ID":"ff1a796b-8cc7-4c73-842f-7b4a1170b56f","Type":"ContainerStarted","Data":"d357abb8aef69e76a766a583b1bb054e2be69e99a46c1bf5872bdd930ce5cc39"} Oct 05 20:28:45 crc kubenswrapper[4753]: E1005 20:28:45.753442 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:daed26a1dce4b8221159a011cbc5905ef63e9887fdca2118e4bcc61f88b5fb0a\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf" podUID="dd3487ac-89f8-40f1-967e-71f7fada0fe1" Oct 05 20:28:45 crc kubenswrapper[4753]: E1005 20:28:45.753778 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd" 
podUID="9c8b9aa1-e15e-475d-a02e-56b430d50bd1" Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.755778 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" event={"ID":"f241e98d-8f7c-492a-a4bc-988dc78b6449","Type":"ContainerStarted","Data":"4e9e5f49965c035fe8c4a9bcd5595d7683de47bbd2d2411790e433eac86e1713"} Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.756436 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.757889 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-c968bb45-xsjn9" event={"ID":"8ed6d37f-576a-4f14-a98a-65193559d7de","Type":"ContainerStarted","Data":"574ea9de317a25dff50b76cd20c99b8a8787dd5943eff325094a0570725ab362"} Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.758250 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-c968bb45-xsjn9" Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.759885 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-ct2l6" event={"ID":"266b0921-1164-46bc-9e78-986f5ded5943","Type":"ContainerStarted","Data":"15c7032515606a94fe25c0518f366cbf0e700419809dc18049b8d5e3c6a98d07"} Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.760241 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-ct2l6" Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.761560 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-rlsgp" 
event={"ID":"00eefbb7-989e-478d-aad3-ff4d236168f2","Type":"ContainerStarted","Data":"2e92865eeb96ddfda0758e3014f3bd4c4482404e0f1cae2997b5283d8ab95799"} Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.761888 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-rlsgp" Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.763265 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-htstg" event={"ID":"d58d3fcd-368b-4d73-8c29-a181f3bdddee","Type":"ContainerStarted","Data":"89c08f6c6f20ae50dceb22661dff257870f08d21e6b8f242f756eb2cfc4a1e7b"} Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.764696 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj" event={"ID":"885f705b-599d-41fe-92cf-ffd000ad5e6e","Type":"ContainerStarted","Data":"be4ab2e55c0b5a7f6a7eae419f6bb4fd6e0dd62aca6a75cb53aa4ed372458ce9"} Oct 05 20:28:45 crc kubenswrapper[4753]: E1005 20:28:45.770399 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5" podUID="0c5e8f9b-e10e-436b-ae33-07a7350f02a1" Oct 05 20:28:45 crc kubenswrapper[4753]: E1005 20:28:45.770509 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:445a1332c0eaaa21a5459d3ffe56a8696a6a61131c39dc7bb47571b251a30830\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48" 
podUID="64896158-a10b-4fd9-b232-5ba3fa647a02" Oct 05 20:28:45 crc kubenswrapper[4753]: E1005 20:28:45.771224 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs" podUID="96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3" Oct 05 20:28:45 crc kubenswrapper[4753]: E1005 20:28:45.776719 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:4cba007c18be1ec9aac2ece7a5ce6444a94afd89f0fb032522811d5bdf5bee73\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj" podUID="885f705b-599d-41fe-92cf-ffd000ad5e6e" Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.822785 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-rlsgp" podStartSLOduration=6.2982517 podStartE2EDuration="28.82275688s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:21.162233572 +0000 UTC m=+810.010561804" lastFinishedPulling="2025-10-05 20:28:43.686738752 +0000 UTC m=+832.535066984" observedRunningTime="2025-10-05 20:28:45.799217642 +0000 UTC m=+834.647545874" watchObservedRunningTime="2025-10-05 20:28:45.82275688 +0000 UTC m=+834.671085112" Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.873949 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92664ac3-02b4-4557-b7d4-2c638b4c082f" path="/var/lib/kubelet/pods/92664ac3-02b4-4557-b7d4-2c638b4c082f/volumes" Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.891115 4753 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" podStartSLOduration=7.47591282 podStartE2EDuration="28.891095899s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:22.285834945 +0000 UTC m=+811.134163177" lastFinishedPulling="2025-10-05 20:28:43.701018024 +0000 UTC m=+832.549346256" observedRunningTime="2025-10-05 20:28:45.847088175 +0000 UTC m=+834.695416407" watchObservedRunningTime="2025-10-05 20:28:45.891095899 +0000 UTC m=+834.739424131" Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.907330 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-c968bb45-xsjn9" podStartSLOduration=6.299898871 podStartE2EDuration="28.907309832s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:21.048365311 +0000 UTC m=+809.896693543" lastFinishedPulling="2025-10-05 20:28:43.655776272 +0000 UTC m=+832.504104504" observedRunningTime="2025-10-05 20:28:45.902573556 +0000 UTC m=+834.750901788" watchObservedRunningTime="2025-10-05 20:28:45.907309832 +0000 UTC m=+834.755638064" Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.933067 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn" podStartSLOduration=5.531860934 podStartE2EDuration="27.93304952s" podCreationTimestamp="2025-10-05 20:28:18 +0000 UTC" firstStartedPulling="2025-10-05 20:28:21.371333435 +0000 UTC m=+810.219661667" lastFinishedPulling="2025-10-05 20:28:43.772522031 +0000 UTC m=+832.620850253" observedRunningTime="2025-10-05 20:28:45.928587772 +0000 UTC m=+834.776916014" watchObservedRunningTime="2025-10-05 20:28:45.93304952 +0000 UTC m=+834.781377752" Oct 05 20:28:45 crc kubenswrapper[4753]: E1005 20:28:45.938094 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j" podUID="d8c88aaa-c54b-4f65-be07-61e23d5a5cd4" Oct 05 20:28:45 crc kubenswrapper[4753]: E1005 20:28:45.947707 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-9dsbr" podUID="ff1a796b-8cc7-4c73-842f-7b4a1170b56f" Oct 05 20:28:45 crc kubenswrapper[4753]: I1005 20:28:45.971446 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-ct2l6" podStartSLOduration=5.997555859 podStartE2EDuration="28.971427009s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:20.728567378 +0000 UTC m=+809.576895610" lastFinishedPulling="2025-10-05 20:28:43.702438528 +0000 UTC m=+832.550766760" observedRunningTime="2025-10-05 20:28:45.967180498 +0000 UTC m=+834.815508740" watchObservedRunningTime="2025-10-05 20:28:45.971427009 +0000 UTC m=+834.819755241" Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.771756 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-wsbjp" event={"ID":"12ff014d-81e6-4a9e-8197-e28fbfc4a06e","Type":"ContainerStarted","Data":"d9f1d908c407f2a35a12f0c68df68f2be431c421f40543d3f2f3aced43511177"} Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.771968 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-wsbjp" Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.774254 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b" event={"ID":"5b8831b7-9250-4ec8-b732-2db04e507cfe","Type":"ContainerStarted","Data":"e11829046be8d8faf4db08f75c4e6105279f597286850e21513d676cd865050c"} Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.774490 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b" Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.775794 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-cp4qf" event={"ID":"3e514d87-9323-4c3b-a372-60e5c65fa731","Type":"ContainerStarted","Data":"f37bd526877d5d2dcca74ed6464a77feb786bd264c1f34761c4ac81b68125cbf"} Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.775908 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-cp4qf" Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.776903 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j" event={"ID":"d8c88aaa-c54b-4f65-be07-61e23d5a5cd4","Type":"ContainerStarted","Data":"44d26f31f1032fe83d1db13188630395747f618f8b44664b0b73c09f481a3865"} Oct 05 20:28:46 crc kubenswrapper[4753]: E1005 20:28:46.778354 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:354a1057bb423082aeda16c0209381a05266e90e30e216522c1462be7d4c4610\\\"\"" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j" podUID="d8c88aaa-c54b-4f65-be07-61e23d5a5cd4" Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.779270 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-zqgbl" 
event={"ID":"8dd994e1-cb87-48dc-b844-2bdbc8b6e48d","Type":"ContainerStarted","Data":"fdf4d1c7772d4317cd369662b8fd3e5509d8dc123d574ace8cb33758dabd5118"} Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.779381 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-zqgbl" Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.784310 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-htstg" event={"ID":"d58d3fcd-368b-4d73-8c29-a181f3bdddee","Type":"ContainerStarted","Data":"4bc1ecdef002506d762227ee52bde06a5e36655d7d2017c6b03b77eba334dec2"} Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.784923 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-htstg" Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.789192 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-4wqxw" event={"ID":"9286136d-f0a7-4488-b346-2b3ea3ab81da","Type":"ContainerStarted","Data":"732cd5c16bddaef946a03a81093f288be465bcba844eb2d13bcb9fa31471cecc"} Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.789284 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-4wqxw" Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.791162 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-4zhzq" event={"ID":"995eda80-87fa-4160-b04e-679668f8d910","Type":"ContainerStarted","Data":"0363df696c1e0234c51d2a74070660c8cefe8d11a7a6258cd97e4855bd4d68cd"} Oct 05 20:28:46 crc kubenswrapper[4753]: E1005 20:28:46.793168 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:daed26a1dce4b8221159a011cbc5905ef63e9887fdca2118e4bcc61f88b5fb0a\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf" podUID="dd3487ac-89f8-40f1-967e-71f7fada0fe1" Oct 05 20:28:46 crc kubenswrapper[4753]: E1005 20:28:46.793737 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd" podUID="9c8b9aa1-e15e-475d-a02e-56b430d50bd1" Oct 05 20:28:46 crc kubenswrapper[4753]: E1005 20:28:46.793783 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:4cba007c18be1ec9aac2ece7a5ce6444a94afd89f0fb032522811d5bdf5bee73\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj" podUID="885f705b-599d-41fe-92cf-ffd000ad5e6e" Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.816204 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-zqgbl" podStartSLOduration=6.325752685 podStartE2EDuration="28.816186689s" podCreationTimestamp="2025-10-05 20:28:18 +0000 UTC" firstStartedPulling="2025-10-05 20:28:21.186251366 +0000 UTC m=+810.034579598" lastFinishedPulling="2025-10-05 20:28:43.67668537 +0000 UTC m=+832.525013602" observedRunningTime="2025-10-05 20:28:46.814714173 +0000 UTC m=+835.663042395" watchObservedRunningTime="2025-10-05 20:28:46.816186689 +0000 UTC m=+835.664514921" Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.820311 4753 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-wsbjp" podStartSLOduration=5.548148967 podStartE2EDuration="29.820301216s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:19.383771497 +0000 UTC m=+808.232099729" lastFinishedPulling="2025-10-05 20:28:43.655923756 +0000 UTC m=+832.504251978" observedRunningTime="2025-10-05 20:28:46.798977045 +0000 UTC m=+835.647305277" watchObservedRunningTime="2025-10-05 20:28:46.820301216 +0000 UTC m=+835.668629448" Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.857228 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b" podStartSLOduration=3.405279045 podStartE2EDuration="29.85720581s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:19.762713315 +0000 UTC m=+808.611041547" lastFinishedPulling="2025-10-05 20:28:46.21464008 +0000 UTC m=+835.062968312" observedRunningTime="2025-10-05 20:28:46.839029516 +0000 UTC m=+835.687357748" watchObservedRunningTime="2025-10-05 20:28:46.85720581 +0000 UTC m=+835.705534052" Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.858275 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-4wqxw" podStartSLOduration=5.713513794 podStartE2EDuration="29.858267743s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:19.532037854 +0000 UTC m=+808.380366086" lastFinishedPulling="2025-10-05 20:28:43.676791803 +0000 UTC m=+832.525120035" observedRunningTime="2025-10-05 20:28:46.853773393 +0000 UTC m=+835.702101635" watchObservedRunningTime="2025-10-05 20:28:46.858267743 +0000 UTC m=+835.706595985" Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.911026 4753 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-76d5577b-4zhzq" podStartSLOduration=6.531593346 podStartE2EDuration="28.911007348s" podCreationTimestamp="2025-10-05 20:28:18 +0000 UTC" firstStartedPulling="2025-10-05 20:28:21.276598317 +0000 UTC m=+810.124926549" lastFinishedPulling="2025-10-05 20:28:43.656012319 +0000 UTC m=+832.504340551" observedRunningTime="2025-10-05 20:28:46.909299715 +0000 UTC m=+835.757627957" watchObservedRunningTime="2025-10-05 20:28:46.911007348 +0000 UTC m=+835.759335580" Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.929727 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-htstg" podStartSLOduration=7.263298799 podStartE2EDuration="29.929708258s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:20.985685679 +0000 UTC m=+809.834013911" lastFinishedPulling="2025-10-05 20:28:43.652095138 +0000 UTC m=+832.500423370" observedRunningTime="2025-10-05 20:28:46.92494259 +0000 UTC m=+835.773270822" watchObservedRunningTime="2025-10-05 20:28:46.929708258 +0000 UTC m=+835.778036490" Oct 05 20:28:46 crc kubenswrapper[4753]: I1005 20:28:46.957856 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-cp4qf" podStartSLOduration=7.00269442 podStartE2EDuration="29.95783428s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:20.750613261 +0000 UTC m=+809.598941493" lastFinishedPulling="2025-10-05 20:28:43.705753121 +0000 UTC m=+832.554081353" observedRunningTime="2025-10-05 20:28:46.95622238 +0000 UTC m=+835.804550612" watchObservedRunningTime="2025-10-05 20:28:46.95783428 +0000 UTC m=+835.806162522" Oct 05 20:28:47 crc kubenswrapper[4753]: I1005 20:28:47.801270 4753 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-9dsbr" event={"ID":"ff1a796b-8cc7-4c73-842f-7b4a1170b56f","Type":"ContainerStarted","Data":"9471ddb1916704d66d503a0aab2afcb0b0ad4cb46e105271732b46f5a10fe2d5"} Oct 05 20:28:47 crc kubenswrapper[4753]: I1005 20:28:47.803437 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-76d5577b-4zhzq" Oct 05 20:28:47 crc kubenswrapper[4753]: I1005 20:28:47.803713 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-9dsbr" Oct 05 20:28:47 crc kubenswrapper[4753]: E1005 20:28:47.811580 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:354a1057bb423082aeda16c0209381a05266e90e30e216522c1462be7d4c4610\\\"\"" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j" podUID="d8c88aaa-c54b-4f65-be07-61e23d5a5cd4" Oct 05 20:28:47 crc kubenswrapper[4753]: I1005 20:28:47.835931 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-9dsbr" podStartSLOduration=4.751887792 podStartE2EDuration="30.835911771s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:21.135050869 +0000 UTC m=+809.983379101" lastFinishedPulling="2025-10-05 20:28:47.219074848 +0000 UTC m=+836.067403080" observedRunningTime="2025-10-05 20:28:47.827972634 +0000 UTC m=+836.676300876" watchObservedRunningTime="2025-10-05 20:28:47.835911771 +0000 UTC m=+836.684240003" Oct 05 20:28:50 crc kubenswrapper[4753]: I1005 20:28:50.088246 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm" Oct 05 20:28:51 crc kubenswrapper[4753]: I1005 20:28:51.521318 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-84788b6bc5-vksxs" Oct 05 20:28:53 crc kubenswrapper[4753]: I1005 20:28:53.842439 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr" event={"ID":"2d0279fb-be4d-47a0-83c7-4452c7b13a5b","Type":"ContainerStarted","Data":"9e64ae47a3adece9386815329fe1756f543ef078f5cc3d70f912a1f2b068ff11"} Oct 05 20:28:53 crc kubenswrapper[4753]: I1005 20:28:53.865704 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr" podStartSLOduration=3.499962371 podStartE2EDuration="35.865686911s" podCreationTimestamp="2025-10-05 20:28:18 +0000 UTC" firstStartedPulling="2025-10-05 20:28:21.212573912 +0000 UTC m=+810.060902144" lastFinishedPulling="2025-10-05 20:28:53.578298452 +0000 UTC m=+842.426626684" observedRunningTime="2025-10-05 20:28:53.862178912 +0000 UTC m=+842.710507154" watchObservedRunningTime="2025-10-05 20:28:53.865686911 +0000 UTC m=+842.714015143" Oct 05 20:28:56 crc kubenswrapper[4753]: I1005 20:28:56.867704 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48" event={"ID":"64896158-a10b-4fd9-b232-5ba3fa647a02","Type":"ContainerStarted","Data":"daf5c84eee46dc6cc2f73965457cc71317231a45e78e750e696d04f39ec3b8ce"} Oct 05 20:28:56 crc kubenswrapper[4753]: I1005 20:28:56.868421 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48" Oct 05 20:28:56 crc kubenswrapper[4753]: I1005 20:28:56.882281 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48" podStartSLOduration=3.014617914 podStartE2EDuration="39.882265129s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:19.461406604 +0000 UTC m=+808.309734836" lastFinishedPulling="2025-10-05 20:28:56.329053799 +0000 UTC m=+845.177382051" observedRunningTime="2025-10-05 20:28:56.880561616 +0000 UTC m=+845.728889848" watchObservedRunningTime="2025-10-05 20:28:56.882265129 +0000 UTC m=+845.730593361" Oct 05 20:28:57 crc kubenswrapper[4753]: I1005 20:28:57.714179 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5b974f6766-wsbjp" Oct 05 20:28:57 crc kubenswrapper[4753]: I1005 20:28:57.772868 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-4wqxw" Oct 05 20:28:57 crc kubenswrapper[4753]: I1005 20:28:57.872214 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5c497dbdb-txd4b" Oct 05 20:28:57 crc kubenswrapper[4753]: I1005 20:28:57.989481 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f5894c49f-ct2l6" Oct 05 20:28:58 crc kubenswrapper[4753]: I1005 20:28:58.363228 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-d6c9dc5bc-rlsgp" Oct 05 20:28:58 crc kubenswrapper[4753]: I1005 20:28:58.396665 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-6c9b57c67-cp4qf" Oct 05 20:28:58 crc kubenswrapper[4753]: I1005 20:28:58.531698 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/octavia-operator-controller-manager-69f59f9d8-htstg" Oct 05 20:28:58 crc kubenswrapper[4753]: I1005 20:28:58.627499 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-c968bb45-xsjn9" Oct 05 20:28:58 crc kubenswrapper[4753]: I1005 20:28:58.766581 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-66f6d6849b-9dsbr" Oct 05 20:28:58 crc kubenswrapper[4753]: I1005 20:28:58.876414 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-76d5577b-4zhzq" Oct 05 20:28:58 crc kubenswrapper[4753]: I1005 20:28:58.883805 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj" event={"ID":"885f705b-599d-41fe-92cf-ffd000ad5e6e","Type":"ContainerStarted","Data":"9733aae4046aab98cc2acfea78411bf48a9375b7f14f6ef966915dcd95183a7b"} Oct 05 20:28:58 crc kubenswrapper[4753]: I1005 20:28:58.883987 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj" Oct 05 20:28:58 crc kubenswrapper[4753]: I1005 20:28:58.911556 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj" podStartSLOduration=4.112251604 podStartE2EDuration="41.91153999s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:20.745959827 +0000 UTC m=+809.594288059" lastFinishedPulling="2025-10-05 20:28:58.545248213 +0000 UTC m=+847.393576445" observedRunningTime="2025-10-05 20:28:58.908527936 +0000 UTC m=+847.756856168" watchObservedRunningTime="2025-10-05 20:28:58.91153999 +0000 UTC m=+847.759868222" Oct 05 20:28:58 crc kubenswrapper[4753]: I1005 20:28:58.920986 
4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-f589c7597-58qhn" Oct 05 20:28:59 crc kubenswrapper[4753]: I1005 20:28:59.112171 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-zqgbl" Oct 05 20:28:59 crc kubenswrapper[4753]: I1005 20:28:59.892628 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs" event={"ID":"96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3","Type":"ContainerStarted","Data":"a25f9b296ece00d33eadfd4278edb2b5298aff808e1cfae0a6ce16939f08e43a"} Oct 05 20:28:59 crc kubenswrapper[4753]: I1005 20:28:59.892849 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs" Oct 05 20:28:59 crc kubenswrapper[4753]: I1005 20:28:59.895014 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd" event={"ID":"9c8b9aa1-e15e-475d-a02e-56b430d50bd1","Type":"ContainerStarted","Data":"4838670e7a582dfe876450f6347e963c3f56aef9926d8864ce24a4befed701e4"} Oct 05 20:28:59 crc kubenswrapper[4753]: I1005 20:28:59.895345 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd" Oct 05 20:28:59 crc kubenswrapper[4753]: I1005 20:28:59.912541 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs" podStartSLOduration=3.865337539 podStartE2EDuration="41.912521661s" podCreationTimestamp="2025-10-05 20:28:18 +0000 UTC" firstStartedPulling="2025-10-05 20:28:21.277876367 +0000 UTC m=+810.126204599" lastFinishedPulling="2025-10-05 20:28:59.325060489 +0000 UTC m=+848.173388721" 
observedRunningTime="2025-10-05 20:28:59.907057122 +0000 UTC m=+848.755385374" watchObservedRunningTime="2025-10-05 20:28:59.912521661 +0000 UTC m=+848.760849914" Oct 05 20:28:59 crc kubenswrapper[4753]: I1005 20:28:59.929370 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd" podStartSLOduration=4.426529395 podStartE2EDuration="42.929352383s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:20.823612434 +0000 UTC m=+809.671940666" lastFinishedPulling="2025-10-05 20:28:59.326435422 +0000 UTC m=+848.174763654" observedRunningTime="2025-10-05 20:28:59.924503513 +0000 UTC m=+848.772831755" watchObservedRunningTime="2025-10-05 20:28:59.929352383 +0000 UTC m=+848.777680615" Oct 05 20:29:00 crc kubenswrapper[4753]: I1005 20:29:00.907979 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf" event={"ID":"dd3487ac-89f8-40f1-967e-71f7fada0fe1","Type":"ContainerStarted","Data":"de77c5c96e16617ba8a5faed38e4af224bafde5f7f78e54264dbfac55b4c1554"} Oct 05 20:29:00 crc kubenswrapper[4753]: I1005 20:29:00.937408 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf" podStartSLOduration=3.752158279 podStartE2EDuration="43.937374373s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:20.078121343 +0000 UTC m=+808.926449575" lastFinishedPulling="2025-10-05 20:29:00.263337437 +0000 UTC m=+849.111665669" observedRunningTime="2025-10-05 20:29:00.926371672 +0000 UTC m=+849.774699944" watchObservedRunningTime="2025-10-05 20:29:00.937374373 +0000 UTC m=+849.785702645" Oct 05 20:29:01 crc kubenswrapper[4753]: I1005 20:29:01.933106 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5" event={"ID":"0c5e8f9b-e10e-436b-ae33-07a7350f02a1","Type":"ContainerStarted","Data":"3ce9fb0ff403c069ab0a07192fed150b71678783939326cba9c38469d987fb1c"} Oct 05 20:29:01 crc kubenswrapper[4753]: I1005 20:29:01.935239 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5" Oct 05 20:29:01 crc kubenswrapper[4753]: I1005 20:29:01.976963 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5" podStartSLOduration=4.271649514 podStartE2EDuration="44.976944511s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" firstStartedPulling="2025-10-05 20:28:20.63735214 +0000 UTC m=+809.485680372" lastFinishedPulling="2025-10-05 20:29:01.342647137 +0000 UTC m=+850.190975369" observedRunningTime="2025-10-05 20:29:01.970818941 +0000 UTC m=+850.819147193" watchObservedRunningTime="2025-10-05 20:29:01.976944511 +0000 UTC m=+850.825272763" Oct 05 20:29:02 crc kubenswrapper[4753]: I1005 20:29:02.942475 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j" event={"ID":"d8c88aaa-c54b-4f65-be07-61e23d5a5cd4","Type":"ContainerStarted","Data":"6acf743cc9e88b5b368bd7f322339ee192169d39fe0b88227d881e10d34b9936"} Oct 05 20:29:02 crc kubenswrapper[4753]: I1005 20:29:02.943011 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j" Oct 05 20:29:02 crc kubenswrapper[4753]: I1005 20:29:02.964738 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j" podStartSLOduration=3.729813207 podStartE2EDuration="45.964707933s" podCreationTimestamp="2025-10-05 20:28:17 +0000 UTC" 
firstStartedPulling="2025-10-05 20:28:20.070724084 +0000 UTC m=+808.919052316" lastFinishedPulling="2025-10-05 20:29:02.30561879 +0000 UTC m=+851.153947042" observedRunningTime="2025-10-05 20:29:02.961091481 +0000 UTC m=+851.809419703" watchObservedRunningTime="2025-10-05 20:29:02.964707933 +0000 UTC m=+851.813036205" Oct 05 20:29:07 crc kubenswrapper[4753]: I1005 20:29:07.743648 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-84bd8f6848-p5g48" Oct 05 20:29:07 crc kubenswrapper[4753]: I1005 20:29:07.878285 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-698456cdc6-tnt6j" Oct 05 20:29:08 crc kubenswrapper[4753]: I1005 20:29:08.180885 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6675647785-dqfcj" Oct 05 20:29:08 crc kubenswrapper[4753]: I1005 20:29:08.241784 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf" Oct 05 20:29:08 crc kubenswrapper[4753]: I1005 20:29:08.244640 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-57c9cdcf57-9kjpf" Oct 05 20:29:08 crc kubenswrapper[4753]: I1005 20:29:08.255663 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7cb48dbc-f4dp5" Oct 05 20:29:08 crc kubenswrapper[4753]: I1005 20:29:08.373193 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-69b956fbf6-vtgfd" Oct 05 20:29:08 crc kubenswrapper[4753]: I1005 20:29:08.926803 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-5d98cc5575-t99zs" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.625169 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-546d69f86c-vvp27"] Oct 05 20:29:26 crc kubenswrapper[4753]: E1005 20:29:26.625871 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81397f28-bb39-4bfc-97dc-ec850c6671a8" containerName="extract-utilities" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.625948 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="81397f28-bb39-4bfc-97dc-ec850c6671a8" containerName="extract-utilities" Oct 05 20:29:26 crc kubenswrapper[4753]: E1005 20:29:26.625968 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92664ac3-02b4-4557-b7d4-2c638b4c082f" containerName="extract-content" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.625974 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="92664ac3-02b4-4557-b7d4-2c638b4c082f" containerName="extract-content" Oct 05 20:29:26 crc kubenswrapper[4753]: E1005 20:29:26.625991 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81397f28-bb39-4bfc-97dc-ec850c6671a8" containerName="registry-server" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.625997 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="81397f28-bb39-4bfc-97dc-ec850c6671a8" containerName="registry-server" Oct 05 20:29:26 crc kubenswrapper[4753]: E1005 20:29:26.626020 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81397f28-bb39-4bfc-97dc-ec850c6671a8" containerName="extract-content" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.626025 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="81397f28-bb39-4bfc-97dc-ec850c6671a8" containerName="extract-content" Oct 05 20:29:26 crc kubenswrapper[4753]: E1005 20:29:26.626032 4753 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="92664ac3-02b4-4557-b7d4-2c638b4c082f" containerName="extract-utilities" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.626038 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="92664ac3-02b4-4557-b7d4-2c638b4c082f" containerName="extract-utilities" Oct 05 20:29:26 crc kubenswrapper[4753]: E1005 20:29:26.626053 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92664ac3-02b4-4557-b7d4-2c638b4c082f" containerName="registry-server" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.626058 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="92664ac3-02b4-4557-b7d4-2c638b4c082f" containerName="registry-server" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.626210 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="81397f28-bb39-4bfc-97dc-ec850c6671a8" containerName="registry-server" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.626246 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="92664ac3-02b4-4557-b7d4-2c638b4c082f" containerName="registry-server" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.632752 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-546d69f86c-vvp27" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.641835 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-546d69f86c-vvp27"] Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.650501 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.650656 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-9z59t" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.650702 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.650715 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.795556 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppj4b\" (UniqueName: \"kubernetes.io/projected/9d6ca1dc-5308-4633-bf0e-ce97c2887028-kube-api-access-ppj4b\") pod \"dnsmasq-dns-546d69f86c-vvp27\" (UID: \"9d6ca1dc-5308-4633-bf0e-ce97c2887028\") " pod="openstack/dnsmasq-dns-546d69f86c-vvp27" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.795632 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6ca1dc-5308-4633-bf0e-ce97c2887028-config\") pod \"dnsmasq-dns-546d69f86c-vvp27\" (UID: \"9d6ca1dc-5308-4633-bf0e-ce97c2887028\") " pod="openstack/dnsmasq-dns-546d69f86c-vvp27" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.831161 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f9579fb85-q6s4w"] Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.832259 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.834385 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.849814 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f9579fb85-q6s4w"] Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.896663 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppj4b\" (UniqueName: \"kubernetes.io/projected/9d6ca1dc-5308-4633-bf0e-ce97c2887028-kube-api-access-ppj4b\") pod \"dnsmasq-dns-546d69f86c-vvp27\" (UID: \"9d6ca1dc-5308-4633-bf0e-ce97c2887028\") " pod="openstack/dnsmasq-dns-546d69f86c-vvp27" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.896737 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6ca1dc-5308-4633-bf0e-ce97c2887028-config\") pod \"dnsmasq-dns-546d69f86c-vvp27\" (UID: \"9d6ca1dc-5308-4633-bf0e-ce97c2887028\") " pod="openstack/dnsmasq-dns-546d69f86c-vvp27" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.897618 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6ca1dc-5308-4633-bf0e-ce97c2887028-config\") pod \"dnsmasq-dns-546d69f86c-vvp27\" (UID: \"9d6ca1dc-5308-4633-bf0e-ce97c2887028\") " pod="openstack/dnsmasq-dns-546d69f86c-vvp27" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.927630 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppj4b\" (UniqueName: \"kubernetes.io/projected/9d6ca1dc-5308-4633-bf0e-ce97c2887028-kube-api-access-ppj4b\") pod \"dnsmasq-dns-546d69f86c-vvp27\" (UID: \"9d6ca1dc-5308-4633-bf0e-ce97c2887028\") " pod="openstack/dnsmasq-dns-546d69f86c-vvp27" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.959583 4753 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-546d69f86c-vvp27" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.998816 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/736a2667-a356-4164-ad9c-80eb9b2d87ea-dns-svc\") pod \"dnsmasq-dns-7f9579fb85-q6s4w\" (UID: \"736a2667-a356-4164-ad9c-80eb9b2d87ea\") " pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.998864 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736a2667-a356-4164-ad9c-80eb9b2d87ea-config\") pod \"dnsmasq-dns-7f9579fb85-q6s4w\" (UID: \"736a2667-a356-4164-ad9c-80eb9b2d87ea\") " pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" Oct 05 20:29:26 crc kubenswrapper[4753]: I1005 20:29:26.998902 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppwgq\" (UniqueName: \"kubernetes.io/projected/736a2667-a356-4164-ad9c-80eb9b2d87ea-kube-api-access-ppwgq\") pod \"dnsmasq-dns-7f9579fb85-q6s4w\" (UID: \"736a2667-a356-4164-ad9c-80eb9b2d87ea\") " pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" Oct 05 20:29:27 crc kubenswrapper[4753]: I1005 20:29:27.105900 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppwgq\" (UniqueName: \"kubernetes.io/projected/736a2667-a356-4164-ad9c-80eb9b2d87ea-kube-api-access-ppwgq\") pod \"dnsmasq-dns-7f9579fb85-q6s4w\" (UID: \"736a2667-a356-4164-ad9c-80eb9b2d87ea\") " pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" Oct 05 20:29:27 crc kubenswrapper[4753]: I1005 20:29:27.105992 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/736a2667-a356-4164-ad9c-80eb9b2d87ea-dns-svc\") pod 
\"dnsmasq-dns-7f9579fb85-q6s4w\" (UID: \"736a2667-a356-4164-ad9c-80eb9b2d87ea\") " pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" Oct 05 20:29:27 crc kubenswrapper[4753]: I1005 20:29:27.106026 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736a2667-a356-4164-ad9c-80eb9b2d87ea-config\") pod \"dnsmasq-dns-7f9579fb85-q6s4w\" (UID: \"736a2667-a356-4164-ad9c-80eb9b2d87ea\") " pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" Oct 05 20:29:27 crc kubenswrapper[4753]: I1005 20:29:27.107268 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/736a2667-a356-4164-ad9c-80eb9b2d87ea-dns-svc\") pod \"dnsmasq-dns-7f9579fb85-q6s4w\" (UID: \"736a2667-a356-4164-ad9c-80eb9b2d87ea\") " pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" Oct 05 20:29:27 crc kubenswrapper[4753]: I1005 20:29:27.108395 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736a2667-a356-4164-ad9c-80eb9b2d87ea-config\") pod \"dnsmasq-dns-7f9579fb85-q6s4w\" (UID: \"736a2667-a356-4164-ad9c-80eb9b2d87ea\") " pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" Oct 05 20:29:27 crc kubenswrapper[4753]: I1005 20:29:27.129105 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppwgq\" (UniqueName: \"kubernetes.io/projected/736a2667-a356-4164-ad9c-80eb9b2d87ea-kube-api-access-ppwgq\") pod \"dnsmasq-dns-7f9579fb85-q6s4w\" (UID: \"736a2667-a356-4164-ad9c-80eb9b2d87ea\") " pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" Oct 05 20:29:27 crc kubenswrapper[4753]: I1005 20:29:27.148720 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" Oct 05 20:29:27 crc kubenswrapper[4753]: I1005 20:29:27.236742 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-546d69f86c-vvp27"] Oct 05 20:29:27 crc kubenswrapper[4753]: I1005 20:29:27.286568 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 05 20:29:27 crc kubenswrapper[4753]: I1005 20:29:27.395547 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f9579fb85-q6s4w"] Oct 05 20:29:27 crc kubenswrapper[4753]: W1005 20:29:27.401315 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod736a2667_a356_4164_ad9c_80eb9b2d87ea.slice/crio-be5296592ce8312f1293242a2904257d69a1a4da59985d30583c2d07038a6811 WatchSource:0}: Error finding container be5296592ce8312f1293242a2904257d69a1a4da59985d30583c2d07038a6811: Status 404 returned error can't find the container with id be5296592ce8312f1293242a2904257d69a1a4da59985d30583c2d07038a6811 Oct 05 20:29:28 crc kubenswrapper[4753]: I1005 20:29:28.167748 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546d69f86c-vvp27" event={"ID":"9d6ca1dc-5308-4633-bf0e-ce97c2887028","Type":"ContainerStarted","Data":"803394cc9d5d0358280ef2a03bca085705afd8240da3a4ea41a0aa7ce314954c"} Oct 05 20:29:28 crc kubenswrapper[4753]: I1005 20:29:28.169192 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" event={"ID":"736a2667-a356-4164-ad9c-80eb9b2d87ea","Type":"ContainerStarted","Data":"be5296592ce8312f1293242a2904257d69a1a4da59985d30583c2d07038a6811"} Oct 05 20:29:29 crc kubenswrapper[4753]: I1005 20:29:29.750333 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-546d69f86c-vvp27"] Oct 05 20:29:29 crc kubenswrapper[4753]: I1005 20:29:29.775654 4753 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-bf999f689-d4wgh"] Oct 05 20:29:29 crc kubenswrapper[4753]: I1005 20:29:29.777073 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf999f689-d4wgh" Oct 05 20:29:29 crc kubenswrapper[4753]: I1005 20:29:29.793089 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf999f689-d4wgh"] Oct 05 20:29:29 crc kubenswrapper[4753]: I1005 20:29:29.953936 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867c0775-f6f9-4fae-ba12-70c68813f8eb-config\") pod \"dnsmasq-dns-bf999f689-d4wgh\" (UID: \"867c0775-f6f9-4fae-ba12-70c68813f8eb\") " pod="openstack/dnsmasq-dns-bf999f689-d4wgh" Oct 05 20:29:29 crc kubenswrapper[4753]: I1005 20:29:29.953990 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/867c0775-f6f9-4fae-ba12-70c68813f8eb-dns-svc\") pod \"dnsmasq-dns-bf999f689-d4wgh\" (UID: \"867c0775-f6f9-4fae-ba12-70c68813f8eb\") " pod="openstack/dnsmasq-dns-bf999f689-d4wgh" Oct 05 20:29:29 crc kubenswrapper[4753]: I1005 20:29:29.954017 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhmv9\" (UniqueName: \"kubernetes.io/projected/867c0775-f6f9-4fae-ba12-70c68813f8eb-kube-api-access-mhmv9\") pod \"dnsmasq-dns-bf999f689-d4wgh\" (UID: \"867c0775-f6f9-4fae-ba12-70c68813f8eb\") " pod="openstack/dnsmasq-dns-bf999f689-d4wgh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.066049 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867c0775-f6f9-4fae-ba12-70c68813f8eb-config\") pod \"dnsmasq-dns-bf999f689-d4wgh\" (UID: \"867c0775-f6f9-4fae-ba12-70c68813f8eb\") " pod="openstack/dnsmasq-dns-bf999f689-d4wgh" Oct 05 20:29:30 crc 
kubenswrapper[4753]: I1005 20:29:30.066122 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/867c0775-f6f9-4fae-ba12-70c68813f8eb-dns-svc\") pod \"dnsmasq-dns-bf999f689-d4wgh\" (UID: \"867c0775-f6f9-4fae-ba12-70c68813f8eb\") " pod="openstack/dnsmasq-dns-bf999f689-d4wgh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.066159 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhmv9\" (UniqueName: \"kubernetes.io/projected/867c0775-f6f9-4fae-ba12-70c68813f8eb-kube-api-access-mhmv9\") pod \"dnsmasq-dns-bf999f689-d4wgh\" (UID: \"867c0775-f6f9-4fae-ba12-70c68813f8eb\") " pod="openstack/dnsmasq-dns-bf999f689-d4wgh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.067308 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867c0775-f6f9-4fae-ba12-70c68813f8eb-config\") pod \"dnsmasq-dns-bf999f689-d4wgh\" (UID: \"867c0775-f6f9-4fae-ba12-70c68813f8eb\") " pod="openstack/dnsmasq-dns-bf999f689-d4wgh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.067814 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/867c0775-f6f9-4fae-ba12-70c68813f8eb-dns-svc\") pod \"dnsmasq-dns-bf999f689-d4wgh\" (UID: \"867c0775-f6f9-4fae-ba12-70c68813f8eb\") " pod="openstack/dnsmasq-dns-bf999f689-d4wgh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.112793 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhmv9\" (UniqueName: \"kubernetes.io/projected/867c0775-f6f9-4fae-ba12-70c68813f8eb-kube-api-access-mhmv9\") pod \"dnsmasq-dns-bf999f689-d4wgh\" (UID: \"867c0775-f6f9-4fae-ba12-70c68813f8eb\") " pod="openstack/dnsmasq-dns-bf999f689-d4wgh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.348613 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7f9579fb85-q6s4w"] Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.387263 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d4d9f7875-wd8wh"] Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.388573 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.403586 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf999f689-d4wgh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.405604 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d4d9f7875-wd8wh"] Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.575827 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0adcc08-1f43-48d5-936d-83797872bb43-config\") pod \"dnsmasq-dns-5d4d9f7875-wd8wh\" (UID: \"c0adcc08-1f43-48d5-936d-83797872bb43\") " pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.575909 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0adcc08-1f43-48d5-936d-83797872bb43-dns-svc\") pod \"dnsmasq-dns-5d4d9f7875-wd8wh\" (UID: \"c0adcc08-1f43-48d5-936d-83797872bb43\") " pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.575930 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdnk6\" (UniqueName: \"kubernetes.io/projected/c0adcc08-1f43-48d5-936d-83797872bb43-kube-api-access-wdnk6\") pod \"dnsmasq-dns-5d4d9f7875-wd8wh\" (UID: \"c0adcc08-1f43-48d5-936d-83797872bb43\") " pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 
20:29:30.679217 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0adcc08-1f43-48d5-936d-83797872bb43-config\") pod \"dnsmasq-dns-5d4d9f7875-wd8wh\" (UID: \"c0adcc08-1f43-48d5-936d-83797872bb43\") " pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.679294 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0adcc08-1f43-48d5-936d-83797872bb43-dns-svc\") pod \"dnsmasq-dns-5d4d9f7875-wd8wh\" (UID: \"c0adcc08-1f43-48d5-936d-83797872bb43\") " pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.679314 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdnk6\" (UniqueName: \"kubernetes.io/projected/c0adcc08-1f43-48d5-936d-83797872bb43-kube-api-access-wdnk6\") pod \"dnsmasq-dns-5d4d9f7875-wd8wh\" (UID: \"c0adcc08-1f43-48d5-936d-83797872bb43\") " pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.680761 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0adcc08-1f43-48d5-936d-83797872bb43-dns-svc\") pod \"dnsmasq-dns-5d4d9f7875-wd8wh\" (UID: \"c0adcc08-1f43-48d5-936d-83797872bb43\") " pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.682253 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0adcc08-1f43-48d5-936d-83797872bb43-config\") pod \"dnsmasq-dns-5d4d9f7875-wd8wh\" (UID: \"c0adcc08-1f43-48d5-936d-83797872bb43\") " pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.714977 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdnk6\" 
(UniqueName: \"kubernetes.io/projected/c0adcc08-1f43-48d5-936d-83797872bb43-kube-api-access-wdnk6\") pod \"dnsmasq-dns-5d4d9f7875-wd8wh\" (UID: \"c0adcc08-1f43-48d5-936d-83797872bb43\") " pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.716799 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.995812 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 05 20:29:30 crc kubenswrapper[4753]: I1005 20:29:30.996901 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.000836 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.000869 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.000998 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.001115 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7sckp" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.001253 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.006706 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.007523 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 05 20:29:31 crc 
kubenswrapper[4753]: I1005 20:29:31.016308 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.100003 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf999f689-d4wgh"] Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.191386 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.191420 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.191450 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.191493 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.191512 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.191533 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.191546 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.191616 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.191639 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcjqf\" (UniqueName: \"kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-kube-api-access-kcjqf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.191660 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.191702 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.192078 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf999f689-d4wgh" event={"ID":"867c0775-f6f9-4fae-ba12-70c68813f8eb","Type":"ContainerStarted","Data":"f75f7c696cdec87f4807184227f0f5e5533448282473aecd762edf3a80d2890b"} Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.293182 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.293274 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.293305 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.293331 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.293349 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.293367 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.293384 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.293414 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.293435 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcjqf\" (UniqueName: \"kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-kube-api-access-kcjqf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.293456 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.293497 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.294059 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.294425 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.295021 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.296036 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.296382 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.296810 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.301789 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.304340 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.304774 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.304986 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.313564 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcjqf\" (UniqueName: \"kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-kube-api-access-kcjqf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.325941 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.386208 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d4d9f7875-wd8wh"] Oct 05 20:29:31 crc kubenswrapper[4753]: W1005 20:29:31.388771 
4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0adcc08_1f43_48d5_936d_83797872bb43.slice/crio-f689e810df661a24228e3db72b23b19a7827c455161d1c4d056ae16af3838ea6 WatchSource:0}: Error finding container f689e810df661a24228e3db72b23b19a7827c455161d1c4d056ae16af3838ea6: Status 404 returned error can't find the container with id f689e810df661a24228e3db72b23b19a7827c455161d1c4d056ae16af3838ea6 Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.544010 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.545987 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.547806 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.547758 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.554513 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.554669 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.554779 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.554859 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.555324 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-4zvh4" Oct 05 20:29:31 crc 
kubenswrapper[4753]: I1005 20:29:31.562656 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.630721 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.701992 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.702055 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.702091 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.702113 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.702140 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.702162 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d7wn\" (UniqueName: \"kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-kube-api-access-2d7wn\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.702195 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.702215 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.702241 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.702273 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.702301 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-config-data\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.804138 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.804883 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.804909 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.804928 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") 
" pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.804953 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.805005 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d7wn\" (UniqueName: \"kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-kube-api-access-2d7wn\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.805024 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.805043 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.805068 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.805097 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.805116 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-config-data\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.806127 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.806493 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.806730 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.806973 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.807107 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.807562 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-config-data\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.808439 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.808768 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.810553 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.818621 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.821925 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d7wn\" (UniqueName: \"kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-kube-api-access-2d7wn\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.844942 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " pod="openstack/rabbitmq-server-0" Oct 05 20:29:31 crc kubenswrapper[4753]: I1005 20:29:31.880393 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 05 20:29:32 crc kubenswrapper[4753]: I1005 20:29:32.117178 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 05 20:29:32 crc kubenswrapper[4753]: I1005 20:29:32.202334 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" event={"ID":"c0adcc08-1f43-48d5-936d-83797872bb43","Type":"ContainerStarted","Data":"f689e810df661a24228e3db72b23b19a7827c455161d1c4d056ae16af3838ea6"} Oct 05 20:29:32 crc kubenswrapper[4753]: I1005 20:29:32.463991 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.106069 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.107782 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.110667 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.110913 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.116053 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-vb6tn" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.116635 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.120763 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.121936 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/openstack-galera-0"] Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.131242 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.248982 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.249030 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.249083 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.249124 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-config-data-default\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.249182 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdkqd\" (UniqueName: 
\"kubernetes.io/projected/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-kube-api-access-pdkqd\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.249207 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-secrets\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.249240 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.249266 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.249522 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-kolla-config\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.351386 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-config-data-generated\") pod 
\"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.351428 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.351792 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-kolla-config\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.351738 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.352176 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.352192 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 
20:29:34.352249 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.352274 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.352318 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-config-data-default\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.352362 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdkqd\" (UniqueName: \"kubernetes.io/projected/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-kube-api-access-pdkqd\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.352383 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-secrets\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.353326 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-kolla-config\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.355207 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.359625 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-config-data-default\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.381200 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.385965 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.394366 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.395306 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.396569 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.400557 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.401219 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-klkhg" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.418397 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-secrets\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.421377 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.433432 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/openstack-cell1-galera-0"] Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.441095 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdkqd\" (UniqueName: \"kubernetes.io/projected/81bd134f-0bbd-4cda-b29c-d4d514d4dbe7-kube-api-access-pdkqd\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.451534 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7\") " pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.557300 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.557339 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20016b6-f321-4cf2-b09a-d35b96c85805-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.557369 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c20016b6-f321-4cf2-b09a-d35b96c85805-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 
20:29:34.557394 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c20016b6-f321-4cf2-b09a-d35b96c85805-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.557435 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbgq5\" (UniqueName: \"kubernetes.io/projected/c20016b6-f321-4cf2-b09a-d35b96c85805-kube-api-access-fbgq5\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.557456 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c20016b6-f321-4cf2-b09a-d35b96c85805-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.557480 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c20016b6-f321-4cf2-b09a-d35b96c85805-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.557494 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c20016b6-f321-4cf2-b09a-d35b96c85805-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 
20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.557514 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c20016b6-f321-4cf2-b09a-d35b96c85805-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.570082 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.571144 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.577945 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.578120 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.584016 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-rrkq6" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.589265 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.659231 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbgq5\" (UniqueName: \"kubernetes.io/projected/c20016b6-f321-4cf2-b09a-d35b96c85805-kube-api-access-fbgq5\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.659284 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/c20016b6-f321-4cf2-b09a-d35b96c85805-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.659322 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c20016b6-f321-4cf2-b09a-d35b96c85805-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.659348 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5091b95a-0011-45bb-b4b8-be273f03f7b4-config-data\") pod \"memcached-0\" (UID: \"5091b95a-0011-45bb-b4b8-be273f03f7b4\") " pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.659369 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c20016b6-f321-4cf2-b09a-d35b96c85805-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.659390 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c20016b6-f321-4cf2-b09a-d35b96c85805-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.659416 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5091b95a-0011-45bb-b4b8-be273f03f7b4-kolla-config\") pod 
\"memcached-0\" (UID: \"5091b95a-0011-45bb-b4b8-be273f03f7b4\") " pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.659459 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhh5d\" (UniqueName: \"kubernetes.io/projected/5091b95a-0011-45bb-b4b8-be273f03f7b4-kube-api-access-dhh5d\") pod \"memcached-0\" (UID: \"5091b95a-0011-45bb-b4b8-be273f03f7b4\") " pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.659484 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5091b95a-0011-45bb-b4b8-be273f03f7b4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5091b95a-0011-45bb-b4b8-be273f03f7b4\") " pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.659504 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5091b95a-0011-45bb-b4b8-be273f03f7b4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5091b95a-0011-45bb-b4b8-be273f03f7b4\") " pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.659528 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.659550 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20016b6-f321-4cf2-b09a-d35b96c85805-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 
20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.659581 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c20016b6-f321-4cf2-b09a-d35b96c85805-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.659601 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c20016b6-f321-4cf2-b09a-d35b96c85805-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.660865 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c20016b6-f321-4cf2-b09a-d35b96c85805-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.661096 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c20016b6-f321-4cf2-b09a-d35b96c85805-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.661558 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c20016b6-f321-4cf2-b09a-d35b96c85805-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.663191 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c20016b6-f321-4cf2-b09a-d35b96c85805-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.663420 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.664907 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c20016b6-f321-4cf2-b09a-d35b96c85805-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.679354 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20016b6-f321-4cf2-b09a-d35b96c85805-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.679970 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c20016b6-f321-4cf2-b09a-d35b96c85805-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.706414 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.706469 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbgq5\" (UniqueName: \"kubernetes.io/projected/c20016b6-f321-4cf2-b09a-d35b96c85805-kube-api-access-fbgq5\") pod \"openstack-cell1-galera-0\" (UID: \"c20016b6-f321-4cf2-b09a-d35b96c85805\") " pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.747888 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.762126 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5091b95a-0011-45bb-b4b8-be273f03f7b4-config-data\") pod \"memcached-0\" (UID: \"5091b95a-0011-45bb-b4b8-be273f03f7b4\") " pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.762200 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5091b95a-0011-45bb-b4b8-be273f03f7b4-kolla-config\") pod \"memcached-0\" (UID: \"5091b95a-0011-45bb-b4b8-be273f03f7b4\") " pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.762234 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhh5d\" (UniqueName: \"kubernetes.io/projected/5091b95a-0011-45bb-b4b8-be273f03f7b4-kube-api-access-dhh5d\") pod \"memcached-0\" (UID: \"5091b95a-0011-45bb-b4b8-be273f03f7b4\") " pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.762252 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5091b95a-0011-45bb-b4b8-be273f03f7b4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5091b95a-0011-45bb-b4b8-be273f03f7b4\") " pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.762268 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5091b95a-0011-45bb-b4b8-be273f03f7b4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5091b95a-0011-45bb-b4b8-be273f03f7b4\") " pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.763258 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5091b95a-0011-45bb-b4b8-be273f03f7b4-kolla-config\") pod \"memcached-0\" (UID: \"5091b95a-0011-45bb-b4b8-be273f03f7b4\") " pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.764893 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5091b95a-0011-45bb-b4b8-be273f03f7b4-config-data\") pod \"memcached-0\" (UID: \"5091b95a-0011-45bb-b4b8-be273f03f7b4\") " pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.765548 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5091b95a-0011-45bb-b4b8-be273f03f7b4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5091b95a-0011-45bb-b4b8-be273f03f7b4\") " pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.767134 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5091b95a-0011-45bb-b4b8-be273f03f7b4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5091b95a-0011-45bb-b4b8-be273f03f7b4\") " pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.780409 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhh5d\" (UniqueName: \"kubernetes.io/projected/5091b95a-0011-45bb-b4b8-be273f03f7b4-kube-api-access-dhh5d\") pod \"memcached-0\" (UID: \"5091b95a-0011-45bb-b4b8-be273f03f7b4\") " pod="openstack/memcached-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.819679 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 05 20:29:34 crc kubenswrapper[4753]: I1005 20:29:34.891811 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 05 20:29:36 crc kubenswrapper[4753]: I1005 20:29:36.267439 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 05 20:29:36 crc kubenswrapper[4753]: I1005 20:29:36.268390 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 05 20:29:36 crc kubenswrapper[4753]: I1005 20:29:36.273122 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-f2vf8" Oct 05 20:29:36 crc kubenswrapper[4753]: I1005 20:29:36.283818 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 05 20:29:36 crc kubenswrapper[4753]: I1005 20:29:36.440162 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r249z\" (UniqueName: \"kubernetes.io/projected/5d0d97ee-e7c7-4f1c-b232-b6377a0c890f-kube-api-access-r249z\") pod \"kube-state-metrics-0\" (UID: \"5d0d97ee-e7c7-4f1c-b232-b6377a0c890f\") " pod="openstack/kube-state-metrics-0" Oct 05 20:29:36 crc kubenswrapper[4753]: I1005 20:29:36.541632 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r249z\" (UniqueName: \"kubernetes.io/projected/5d0d97ee-e7c7-4f1c-b232-b6377a0c890f-kube-api-access-r249z\") pod 
\"kube-state-metrics-0\" (UID: \"5d0d97ee-e7c7-4f1c-b232-b6377a0c890f\") " pod="openstack/kube-state-metrics-0" Oct 05 20:29:36 crc kubenswrapper[4753]: I1005 20:29:36.559048 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r249z\" (UniqueName: \"kubernetes.io/projected/5d0d97ee-e7c7-4f1c-b232-b6377a0c890f-kube-api-access-r249z\") pod \"kube-state-metrics-0\" (UID: \"5d0d97ee-e7c7-4f1c-b232-b6377a0c890f\") " pod="openstack/kube-state-metrics-0" Oct 05 20:29:36 crc kubenswrapper[4753]: I1005 20:29:36.640492 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 05 20:29:39 crc kubenswrapper[4753]: W1005 20:29:39.598579 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f77d64e_7c7a_4770_9710_8c4aa767bcfa.slice/crio-b7b9aaa4bf46c8f8ebd0e0b759f81cb9cf2fe3de6a597a5ca9e34c99552ceb43 WatchSource:0}: Error finding container b7b9aaa4bf46c8f8ebd0e0b759f81cb9cf2fe3de6a597a5ca9e34c99552ceb43: Status 404 returned error can't find the container with id b7b9aaa4bf46c8f8ebd0e0b759f81cb9cf2fe3de6a597a5ca9e34c99552ceb43 Oct 05 20:29:40 crc kubenswrapper[4753]: I1005 20:29:40.333300 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73d182e9-8e4b-46ce-aa0b-6fd751eefecd","Type":"ContainerStarted","Data":"f10e1725cc723c29ef4bd7b39b13680e39522e023e3435dbc375a368b642b94c"} Oct 05 20:29:40 crc kubenswrapper[4753]: I1005 20:29:40.334747 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f77d64e-7c7a-4770-9710-8c4aa767bcfa","Type":"ContainerStarted","Data":"b7b9aaa4bf46c8f8ebd0e0b759f81cb9cf2fe3de6a597a5ca9e34c99552ceb43"} Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.136122 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 05 20:29:41 crc 
kubenswrapper[4753]: I1005 20:29:41.137653 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.139398 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.139630 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.143292 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.143412 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.147424 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-5dbrz" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.161084 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.241711 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.241755 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/19914a64-715e-4a20-82fc-f4e86b8e9e21-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.241787 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/19914a64-715e-4a20-82fc-f4e86b8e9e21-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.241817 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/19914a64-715e-4a20-82fc-f4e86b8e9e21-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.241839 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19914a64-715e-4a20-82fc-f4e86b8e9e21-config\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.241859 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19914a64-715e-4a20-82fc-f4e86b8e9e21-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.241879 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19914a64-715e-4a20-82fc-f4e86b8e9e21-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.241894 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-4zqtg\" (UniqueName: \"kubernetes.io/projected/19914a64-715e-4a20-82fc-f4e86b8e9e21-kube-api-access-4zqtg\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.343098 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/19914a64-715e-4a20-82fc-f4e86b8e9e21-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.343182 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/19914a64-715e-4a20-82fc-f4e86b8e9e21-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.343205 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19914a64-715e-4a20-82fc-f4e86b8e9e21-config\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.343226 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19914a64-715e-4a20-82fc-f4e86b8e9e21-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.343247 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19914a64-715e-4a20-82fc-f4e86b8e9e21-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.343263 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zqtg\" (UniqueName: \"kubernetes.io/projected/19914a64-715e-4a20-82fc-f4e86b8e9e21-kube-api-access-4zqtg\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.343334 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.343367 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/19914a64-715e-4a20-82fc-f4e86b8e9e21-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.345121 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19914a64-715e-4a20-82fc-f4e86b8e9e21-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.345441 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/19914a64-715e-4a20-82fc-f4e86b8e9e21-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.348849 4753 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19914a64-715e-4a20-82fc-f4e86b8e9e21-config\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.349446 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.367713 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/19914a64-715e-4a20-82fc-f4e86b8e9e21-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.377058 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/19914a64-715e-4a20-82fc-f4e86b8e9e21-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.378343 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.388266 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zqtg\" (UniqueName: \"kubernetes.io/projected/19914a64-715e-4a20-82fc-f4e86b8e9e21-kube-api-access-4zqtg\") pod \"ovsdbserver-nb-0\" (UID: 
\"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.389105 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19914a64-715e-4a20-82fc-f4e86b8e9e21-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"19914a64-715e-4a20-82fc-f4e86b8e9e21\") " pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.400324 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7zxq7"] Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.402392 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.407909 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-xt7ck" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.408334 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.408479 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.425302 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7zxq7"] Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.447386 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x79j4\" (UniqueName: \"kubernetes.io/projected/61f845cb-9404-421b-b20f-9dee4edd00f8-kube-api-access-x79j4\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.447454 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/61f845cb-9404-421b-b20f-9dee4edd00f8-var-run-ovn\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.447489 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/61f845cb-9404-421b-b20f-9dee4edd00f8-var-log-ovn\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.447568 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f845cb-9404-421b-b20f-9dee4edd00f8-combined-ca-bundle\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.447587 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/61f845cb-9404-421b-b20f-9dee4edd00f8-var-run\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.447613 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61f845cb-9404-421b-b20f-9dee4edd00f8-scripts\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.447639 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f845cb-9404-421b-b20f-9dee4edd00f8-ovn-controller-tls-certs\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.462819 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.468650 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-8krg4"] Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.476414 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8krg4"] Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.477984 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.548372 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-var-run\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.548432 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-scripts\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.548493 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-var-log\") pod \"ovn-controller-ovs-8krg4\" (UID: 
\"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.548543 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f845cb-9404-421b-b20f-9dee4edd00f8-combined-ca-bundle\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.548564 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-var-lib\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.548949 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/61f845cb-9404-421b-b20f-9dee4edd00f8-var-run\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.548982 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61f845cb-9404-421b-b20f-9dee4edd00f8-scripts\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.549004 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f845cb-9404-421b-b20f-9dee4edd00f8-ovn-controller-tls-certs\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: 
I1005 20:29:41.549026 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4kxt\" (UniqueName: \"kubernetes.io/projected/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-kube-api-access-l4kxt\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.549059 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x79j4\" (UniqueName: \"kubernetes.io/projected/61f845cb-9404-421b-b20f-9dee4edd00f8-kube-api-access-x79j4\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.549076 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-etc-ovs\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.549100 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/61f845cb-9404-421b-b20f-9dee4edd00f8-var-run-ovn\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.549116 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/61f845cb-9404-421b-b20f-9dee4edd00f8-var-log-ovn\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.549465 4753 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/61f845cb-9404-421b-b20f-9dee4edd00f8-var-run\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.549482 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/61f845cb-9404-421b-b20f-9dee4edd00f8-var-log-ovn\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.549558 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/61f845cb-9404-421b-b20f-9dee4edd00f8-var-run-ovn\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.550899 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61f845cb-9404-421b-b20f-9dee4edd00f8-scripts\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.554604 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f845cb-9404-421b-b20f-9dee4edd00f8-combined-ca-bundle\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.567037 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x79j4\" (UniqueName: \"kubernetes.io/projected/61f845cb-9404-421b-b20f-9dee4edd00f8-kube-api-access-x79j4\") pod \"ovn-controller-7zxq7\" (UID: 
\"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.569571 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f845cb-9404-421b-b20f-9dee4edd00f8-ovn-controller-tls-certs\") pod \"ovn-controller-7zxq7\" (UID: \"61f845cb-9404-421b-b20f-9dee4edd00f8\") " pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.650783 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-etc-ovs\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.650894 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-var-run\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.650913 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-scripts\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.650940 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-var-log\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.650960 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-var-lib\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.650980 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-var-run\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.651001 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4kxt\" (UniqueName: \"kubernetes.io/projected/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-kube-api-access-l4kxt\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.651074 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-etc-ovs\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.651238 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-var-lib\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.651313 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-var-log\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.652861 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-scripts\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.671754 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4kxt\" (UniqueName: \"kubernetes.io/projected/61cd2b9f-f08f-4b47-be04-1d9246a5cbdb-kube-api-access-l4kxt\") pod \"ovn-controller-ovs-8krg4\" (UID: \"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb\") " pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.727333 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7zxq7" Oct 05 20:29:41 crc kubenswrapper[4753]: I1005 20:29:41.798684 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.192356 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.193848 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.196286 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.196435 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-wffq9" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.203708 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.204515 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.221965 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.295094 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.295179 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjwm7\" (UniqueName: \"kubernetes.io/projected/396e74f9-75f0-4643-a011-da8c56174984-kube-api-access-bjwm7\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.295212 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/396e74f9-75f0-4643-a011-da8c56174984-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.295842 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/396e74f9-75f0-4643-a011-da8c56174984-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.295907 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/396e74f9-75f0-4643-a011-da8c56174984-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.295925 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/396e74f9-75f0-4643-a011-da8c56174984-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.295954 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/396e74f9-75f0-4643-a011-da8c56174984-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.295976 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396e74f9-75f0-4643-a011-da8c56174984-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " 
pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.397783 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/396e74f9-75f0-4643-a011-da8c56174984-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.397866 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/396e74f9-75f0-4643-a011-da8c56174984-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.397889 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/396e74f9-75f0-4643-a011-da8c56174984-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.397920 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/396e74f9-75f0-4643-a011-da8c56174984-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.397940 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396e74f9-75f0-4643-a011-da8c56174984-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.397963 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.397987 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjwm7\" (UniqueName: \"kubernetes.io/projected/396e74f9-75f0-4643-a011-da8c56174984-kube-api-access-bjwm7\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.398003 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/396e74f9-75f0-4643-a011-da8c56174984-config\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.399172 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/396e74f9-75f0-4643-a011-da8c56174984-config\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.399444 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/396e74f9-75f0-4643-a011-da8c56174984-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.403169 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/396e74f9-75f0-4643-a011-da8c56174984-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " 
pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.404640 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.407342 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/396e74f9-75f0-4643-a011-da8c56174984-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.417637 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/396e74f9-75f0-4643-a011-da8c56174984-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.418794 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396e74f9-75f0-4643-a011-da8c56174984-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.421816 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjwm7\" (UniqueName: \"kubernetes.io/projected/396e74f9-75f0-4643-a011-da8c56174984-kube-api-access-bjwm7\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.426085 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"396e74f9-75f0-4643-a011-da8c56174984\") " pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:43 crc kubenswrapper[4753]: I1005 20:29:43.522640 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 05 20:29:48 crc kubenswrapper[4753]: E1005 20:29:48.327819 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:053c95cc75e5bc6de83a08f3196125bb5fbbfea1795643daf3f1378cbaad5d26" Oct 05 20:29:48 crc kubenswrapper[4753]: E1005 20:29:48.328840 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:053c95cc75e5bc6de83a08f3196125bb5fbbfea1795643daf3f1378cbaad5d26,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ppwgq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7f9579fb85-q6s4w_openstack(736a2667-a356-4164-ad9c-80eb9b2d87ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:29:48 crc kubenswrapper[4753]: E1005 20:29:48.330223 4753 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" podUID="736a2667-a356-4164-ad9c-80eb9b2d87ea" Oct 05 20:29:48 crc kubenswrapper[4753]: E1005 20:29:48.378328 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:053c95cc75e5bc6de83a08f3196125bb5fbbfea1795643daf3f1378cbaad5d26" Oct 05 20:29:48 crc kubenswrapper[4753]: E1005 20:29:48.378449 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:053c95cc75e5bc6de83a08f3196125bb5fbbfea1795643daf3f1378cbaad5d26,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mhmv9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-bf999f689-d4wgh_openstack(867c0775-f6f9-4fae-ba12-70c68813f8eb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:29:48 crc kubenswrapper[4753]: E1005 20:29:48.379581 4753 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-bf999f689-d4wgh" podUID="867c0775-f6f9-4fae-ba12-70c68813f8eb" Oct 05 20:29:48 crc kubenswrapper[4753]: E1005 20:29:48.379671 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:053c95cc75e5bc6de83a08f3196125bb5fbbfea1795643daf3f1378cbaad5d26" Oct 05 20:29:48 crc kubenswrapper[4753]: E1005 20:29:48.379766 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:053c95cc75e5bc6de83a08f3196125bb5fbbfea1795643daf3f1378cbaad5d26,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ppj4b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-546d69f86c-vvp27_openstack(9d6ca1dc-5308-4633-bf0e-ce97c2887028): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:29:48 crc kubenswrapper[4753]: E1005 20:29:48.383577 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-546d69f86c-vvp27" podUID="9d6ca1dc-5308-4633-bf0e-ce97c2887028" Oct 05 20:29:48 crc kubenswrapper[4753]: E1005 20:29:48.443979 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:053c95cc75e5bc6de83a08f3196125bb5fbbfea1795643daf3f1378cbaad5d26\\\"\"" pod="openstack/dnsmasq-dns-bf999f689-d4wgh" podUID="867c0775-f6f9-4fae-ba12-70c68813f8eb" Oct 05 20:29:48 crc kubenswrapper[4753]: I1005 20:29:48.807662 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 05 20:29:49 crc kubenswrapper[4753]: I1005 20:29:49.778295 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" Oct 05 20:29:49 crc kubenswrapper[4753]: I1005 20:29:49.808792 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-546d69f86c-vvp27" Oct 05 20:29:49 crc kubenswrapper[4753]: I1005 20:29:49.904817 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppj4b\" (UniqueName: \"kubernetes.io/projected/9d6ca1dc-5308-4633-bf0e-ce97c2887028-kube-api-access-ppj4b\") pod \"9d6ca1dc-5308-4633-bf0e-ce97c2887028\" (UID: \"9d6ca1dc-5308-4633-bf0e-ce97c2887028\") " Oct 05 20:29:49 crc kubenswrapper[4753]: I1005 20:29:49.904861 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppwgq\" (UniqueName: \"kubernetes.io/projected/736a2667-a356-4164-ad9c-80eb9b2d87ea-kube-api-access-ppwgq\") pod \"736a2667-a356-4164-ad9c-80eb9b2d87ea\" (UID: \"736a2667-a356-4164-ad9c-80eb9b2d87ea\") " Oct 05 20:29:49 crc kubenswrapper[4753]: I1005 20:29:49.904949 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6ca1dc-5308-4633-bf0e-ce97c2887028-config\") pod \"9d6ca1dc-5308-4633-bf0e-ce97c2887028\" (UID: \"9d6ca1dc-5308-4633-bf0e-ce97c2887028\") " Oct 05 20:29:49 crc kubenswrapper[4753]: I1005 20:29:49.904999 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/736a2667-a356-4164-ad9c-80eb9b2d87ea-dns-svc\") pod \"736a2667-a356-4164-ad9c-80eb9b2d87ea\" (UID: \"736a2667-a356-4164-ad9c-80eb9b2d87ea\") " Oct 05 20:29:49 crc kubenswrapper[4753]: I1005 20:29:49.905013 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736a2667-a356-4164-ad9c-80eb9b2d87ea-config\") pod \"736a2667-a356-4164-ad9c-80eb9b2d87ea\" (UID: \"736a2667-a356-4164-ad9c-80eb9b2d87ea\") " Oct 05 20:29:49 crc kubenswrapper[4753]: I1005 20:29:49.905684 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/736a2667-a356-4164-ad9c-80eb9b2d87ea-config" (OuterVolumeSpecName: "config") pod "736a2667-a356-4164-ad9c-80eb9b2d87ea" (UID: "736a2667-a356-4164-ad9c-80eb9b2d87ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:29:49 crc kubenswrapper[4753]: I1005 20:29:49.906981 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736a2667-a356-4164-ad9c-80eb9b2d87ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "736a2667-a356-4164-ad9c-80eb9b2d87ea" (UID: "736a2667-a356-4164-ad9c-80eb9b2d87ea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:29:49 crc kubenswrapper[4753]: I1005 20:29:49.907439 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6ca1dc-5308-4633-bf0e-ce97c2887028-config" (OuterVolumeSpecName: "config") pod "9d6ca1dc-5308-4633-bf0e-ce97c2887028" (UID: "9d6ca1dc-5308-4633-bf0e-ce97c2887028"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:29:49 crc kubenswrapper[4753]: I1005 20:29:49.910358 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d6ca1dc-5308-4633-bf0e-ce97c2887028-kube-api-access-ppj4b" (OuterVolumeSpecName: "kube-api-access-ppj4b") pod "9d6ca1dc-5308-4633-bf0e-ce97c2887028" (UID: "9d6ca1dc-5308-4633-bf0e-ce97c2887028"). InnerVolumeSpecName "kube-api-access-ppj4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:29:49 crc kubenswrapper[4753]: I1005 20:29:49.911279 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736a2667-a356-4164-ad9c-80eb9b2d87ea-kube-api-access-ppwgq" (OuterVolumeSpecName: "kube-api-access-ppwgq") pod "736a2667-a356-4164-ad9c-80eb9b2d87ea" (UID: "736a2667-a356-4164-ad9c-80eb9b2d87ea"). InnerVolumeSpecName "kube-api-access-ppwgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.007280 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppwgq\" (UniqueName: \"kubernetes.io/projected/736a2667-a356-4164-ad9c-80eb9b2d87ea-kube-api-access-ppwgq\") on node \"crc\" DevicePath \"\"" Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.007312 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6ca1dc-5308-4633-bf0e-ce97c2887028-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.007324 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/736a2667-a356-4164-ad9c-80eb9b2d87ea-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.007333 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736a2667-a356-4164-ad9c-80eb9b2d87ea-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.007351 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppj4b\" (UniqueName: \"kubernetes.io/projected/9d6ca1dc-5308-4633-bf0e-ce97c2887028-kube-api-access-ppj4b\") on node \"crc\" DevicePath \"\"" Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.112444 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.212243 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 05 20:29:50 crc kubenswrapper[4753]: W1005 20:29:50.220704 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d0d97ee_e7c7_4f1c_b232_b6377a0c890f.slice/crio-c26c915c793fecef25e34620e16738fe40c9f879a44b5508ab01260d3b2869f0 
WatchSource:0}: Error finding container c26c915c793fecef25e34620e16738fe40c9f879a44b5508ab01260d3b2869f0: Status 404 returned error can't find the container with id c26c915c793fecef25e34620e16738fe40c9f879a44b5508ab01260d3b2869f0 Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.369416 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7zxq7"] Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.432899 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 05 20:29:50 crc kubenswrapper[4753]: W1005 20:29:50.444466 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61f845cb_9404_421b_b20f_9dee4edd00f8.slice/crio-9d1e8846787d858cb7a5f69bf369bbf2bf550bd7088f9175cadc22d70fdf0398 WatchSource:0}: Error finding container 9d1e8846787d858cb7a5f69bf369bbf2bf550bd7088f9175cadc22d70fdf0398: Status 404 returned error can't find the container with id 9d1e8846787d858cb7a5f69bf369bbf2bf550bd7088f9175cadc22d70fdf0398 Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.452765 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.452763 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9579fb85-q6s4w" event={"ID":"736a2667-a356-4164-ad9c-80eb9b2d87ea","Type":"ContainerDied","Data":"be5296592ce8312f1293242a2904257d69a1a4da59985d30583c2d07038a6811"} Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.454723 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7zxq7" event={"ID":"61f845cb-9404-421b-b20f-9dee4edd00f8","Type":"ContainerStarted","Data":"9d1e8846787d858cb7a5f69bf369bbf2bf550bd7088f9175cadc22d70fdf0398"} Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.458767 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7","Type":"ContainerStarted","Data":"257ea9acee38d016050723a0bc52f5974cb91009946dcb5911339b4ad77cfc1c"} Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.460830 4753 generic.go:334] "Generic (PLEG): container finished" podID="c0adcc08-1f43-48d5-936d-83797872bb43" containerID="b5a1ef213ffa6ddec609c2f2dfcc8bc5e7a2343793097bbd1c3cf2827a5fe6f1" exitCode=0 Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.460872 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" event={"ID":"c0adcc08-1f43-48d5-936d-83797872bb43","Type":"ContainerDied","Data":"b5a1ef213ffa6ddec609c2f2dfcc8bc5e7a2343793097bbd1c3cf2827a5fe6f1"} Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.463292 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c20016b6-f321-4cf2-b09a-d35b96c85805","Type":"ContainerStarted","Data":"6b05d4de7a6eb5d873ee53f3fb8c9a6e788f2d2d6ec48d9b56ecdb5eb0eeb6dd"} Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.464271 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-546d69f86c-vvp27" event={"ID":"9d6ca1dc-5308-4633-bf0e-ce97c2887028","Type":"ContainerDied","Data":"803394cc9d5d0358280ef2a03bca085705afd8240da3a4ea41a0aa7ce314954c"} Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.464327 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-546d69f86c-vvp27" Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.470207 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5d0d97ee-e7c7-4f1c-b232-b6377a0c890f","Type":"ContainerStarted","Data":"c26c915c793fecef25e34620e16738fe40c9f879a44b5508ab01260d3b2869f0"} Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.535940 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f9579fb85-q6s4w"] Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.541636 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f9579fb85-q6s4w"] Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.571663 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-546d69f86c-vvp27"] Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.577510 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-546d69f86c-vvp27"] Oct 05 20:29:50 crc kubenswrapper[4753]: I1005 20:29:50.595278 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 05 20:29:51 crc kubenswrapper[4753]: I1005 20:29:51.178670 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 05 20:29:51 crc kubenswrapper[4753]: I1005 20:29:51.430441 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8krg4"] Oct 05 20:29:51 crc kubenswrapper[4753]: I1005 20:29:51.478176 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"6f77d64e-7c7a-4770-9710-8c4aa767bcfa","Type":"ContainerStarted","Data":"5147e49072301e23beb6f2459787fe2e686175dda9c52f176ca87e213655d772"} Oct 05 20:29:51 crc kubenswrapper[4753]: I1005 20:29:51.481355 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"19914a64-715e-4a20-82fc-f4e86b8e9e21","Type":"ContainerStarted","Data":"4f65ca1f79534c25a1ce414c183a85844dca2aa64ce026dbfeb35d19c1b2082a"} Oct 05 20:29:51 crc kubenswrapper[4753]: I1005 20:29:51.483498 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73d182e9-8e4b-46ce-aa0b-6fd751eefecd","Type":"ContainerStarted","Data":"a730afd36443ffc2b80abbc60e2cca6017825e03868c24ebe4bb442fedc5ae24"} Oct 05 20:29:51 crc kubenswrapper[4753]: I1005 20:29:51.485127 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" event={"ID":"c0adcc08-1f43-48d5-936d-83797872bb43","Type":"ContainerStarted","Data":"7b348c9b0f4c9037701ad7dbf03f361e6b34eae12a474f170e03f2a2d8ee5a6c"} Oct 05 20:29:51 crc kubenswrapper[4753]: I1005 20:29:51.485382 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" Oct 05 20:29:51 crc kubenswrapper[4753]: I1005 20:29:51.490393 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5091b95a-0011-45bb-b4b8-be273f03f7b4","Type":"ContainerStarted","Data":"cbe0ee1817958819e0e718844d88b0d4f0ea214f849a610e1eb27d8cd5ba37d3"} Oct 05 20:29:51 crc kubenswrapper[4753]: I1005 20:29:51.523013 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" podStartSLOduration=3.262860205 podStartE2EDuration="21.522988013s" podCreationTimestamp="2025-10-05 20:29:30 +0000 UTC" firstStartedPulling="2025-10-05 20:29:31.391062338 +0000 UTC m=+880.239390560" lastFinishedPulling="2025-10-05 20:29:49.651190136 +0000 UTC 
m=+898.499518368" observedRunningTime="2025-10-05 20:29:51.517738641 +0000 UTC m=+900.366066873" watchObservedRunningTime="2025-10-05 20:29:51.522988013 +0000 UTC m=+900.371316245" Oct 05 20:29:51 crc kubenswrapper[4753]: W1005 20:29:51.785123 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61cd2b9f_f08f_4b47_be04_1d9246a5cbdb.slice/crio-44a1e14e667fe9f878db6f4f5d327b2a4c9b1337927ecd367023545f82098a85 WatchSource:0}: Error finding container 44a1e14e667fe9f878db6f4f5d327b2a4c9b1337927ecd367023545f82098a85: Status 404 returned error can't find the container with id 44a1e14e667fe9f878db6f4f5d327b2a4c9b1337927ecd367023545f82098a85 Oct 05 20:29:51 crc kubenswrapper[4753]: I1005 20:29:51.893545 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736a2667-a356-4164-ad9c-80eb9b2d87ea" path="/var/lib/kubelet/pods/736a2667-a356-4164-ad9c-80eb9b2d87ea/volumes" Oct 05 20:29:51 crc kubenswrapper[4753]: I1005 20:29:51.893902 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d6ca1dc-5308-4633-bf0e-ce97c2887028" path="/var/lib/kubelet/pods/9d6ca1dc-5308-4633-bf0e-ce97c2887028/volumes" Oct 05 20:29:52 crc kubenswrapper[4753]: I1005 20:29:52.504796 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"396e74f9-75f0-4643-a011-da8c56174984","Type":"ContainerStarted","Data":"8a3dad89f276453865277de84a69f5306b817c4ab9cd8916b94b6e4fa003eeab"} Oct 05 20:29:52 crc kubenswrapper[4753]: I1005 20:29:52.506656 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8krg4" event={"ID":"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb","Type":"ContainerStarted","Data":"44a1e14e667fe9f878db6f4f5d327b2a4c9b1337927ecd367023545f82098a85"} Oct 05 20:29:55 crc kubenswrapper[4753]: I1005 20:29:55.720338 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" Oct 05 20:29:55 crc kubenswrapper[4753]: I1005 20:29:55.777601 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf999f689-d4wgh"] Oct 05 20:29:57 crc kubenswrapper[4753]: I1005 20:29:57.178517 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf999f689-d4wgh" Oct 05 20:29:57 crc kubenswrapper[4753]: I1005 20:29:57.245276 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/867c0775-f6f9-4fae-ba12-70c68813f8eb-dns-svc\") pod \"867c0775-f6f9-4fae-ba12-70c68813f8eb\" (UID: \"867c0775-f6f9-4fae-ba12-70c68813f8eb\") " Oct 05 20:29:57 crc kubenswrapper[4753]: I1005 20:29:57.245366 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhmv9\" (UniqueName: \"kubernetes.io/projected/867c0775-f6f9-4fae-ba12-70c68813f8eb-kube-api-access-mhmv9\") pod \"867c0775-f6f9-4fae-ba12-70c68813f8eb\" (UID: \"867c0775-f6f9-4fae-ba12-70c68813f8eb\") " Oct 05 20:29:57 crc kubenswrapper[4753]: I1005 20:29:57.245495 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867c0775-f6f9-4fae-ba12-70c68813f8eb-config\") pod \"867c0775-f6f9-4fae-ba12-70c68813f8eb\" (UID: \"867c0775-f6f9-4fae-ba12-70c68813f8eb\") " Oct 05 20:29:57 crc kubenswrapper[4753]: I1005 20:29:57.246065 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/867c0775-f6f9-4fae-ba12-70c68813f8eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "867c0775-f6f9-4fae-ba12-70c68813f8eb" (UID: "867c0775-f6f9-4fae-ba12-70c68813f8eb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:29:57 crc kubenswrapper[4753]: I1005 20:29:57.246092 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/867c0775-f6f9-4fae-ba12-70c68813f8eb-config" (OuterVolumeSpecName: "config") pod "867c0775-f6f9-4fae-ba12-70c68813f8eb" (UID: "867c0775-f6f9-4fae-ba12-70c68813f8eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:29:57 crc kubenswrapper[4753]: I1005 20:29:57.251912 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/867c0775-f6f9-4fae-ba12-70c68813f8eb-kube-api-access-mhmv9" (OuterVolumeSpecName: "kube-api-access-mhmv9") pod "867c0775-f6f9-4fae-ba12-70c68813f8eb" (UID: "867c0775-f6f9-4fae-ba12-70c68813f8eb"). InnerVolumeSpecName "kube-api-access-mhmv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:29:57 crc kubenswrapper[4753]: I1005 20:29:57.347493 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867c0775-f6f9-4fae-ba12-70c68813f8eb-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:29:57 crc kubenswrapper[4753]: I1005 20:29:57.347525 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/867c0775-f6f9-4fae-ba12-70c68813f8eb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 05 20:29:57 crc kubenswrapper[4753]: I1005 20:29:57.347537 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhmv9\" (UniqueName: \"kubernetes.io/projected/867c0775-f6f9-4fae-ba12-70c68813f8eb-kube-api-access-mhmv9\") on node \"crc\" DevicePath \"\"" Oct 05 20:29:57 crc kubenswrapper[4753]: I1005 20:29:57.549342 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf999f689-d4wgh" 
event={"ID":"867c0775-f6f9-4fae-ba12-70c68813f8eb","Type":"ContainerDied","Data":"f75f7c696cdec87f4807184227f0f5e5533448282473aecd762edf3a80d2890b"} Oct 05 20:29:57 crc kubenswrapper[4753]: I1005 20:29:57.549400 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf999f689-d4wgh" Oct 05 20:29:57 crc kubenswrapper[4753]: I1005 20:29:57.602953 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf999f689-d4wgh"] Oct 05 20:29:57 crc kubenswrapper[4753]: I1005 20:29:57.607415 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bf999f689-d4wgh"] Oct 05 20:29:57 crc kubenswrapper[4753]: I1005 20:29:57.862384 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="867c0775-f6f9-4fae-ba12-70c68813f8eb" path="/var/lib/kubelet/pods/867c0775-f6f9-4fae-ba12-70c68813f8eb/volumes" Oct 05 20:30:00 crc kubenswrapper[4753]: I1005 20:30:00.139634 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99"] Oct 05 20:30:00 crc kubenswrapper[4753]: I1005 20:30:00.140908 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" Oct 05 20:30:00 crc kubenswrapper[4753]: I1005 20:30:00.146078 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 05 20:30:00 crc kubenswrapper[4753]: I1005 20:30:00.146258 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 05 20:30:00 crc kubenswrapper[4753]: I1005 20:30:00.155251 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99"] Oct 05 20:30:00 crc kubenswrapper[4753]: I1005 20:30:00.209310 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f15f854-d530-42ad-a821-537981a408e3-secret-volume\") pod \"collect-profiles-29328270-p4s99\" (UID: \"3f15f854-d530-42ad-a821-537981a408e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" Oct 05 20:30:00 crc kubenswrapper[4753]: I1005 20:30:00.209380 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ffnz\" (UniqueName: \"kubernetes.io/projected/3f15f854-d530-42ad-a821-537981a408e3-kube-api-access-6ffnz\") pod \"collect-profiles-29328270-p4s99\" (UID: \"3f15f854-d530-42ad-a821-537981a408e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" Oct 05 20:30:00 crc kubenswrapper[4753]: I1005 20:30:00.209573 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f15f854-d530-42ad-a821-537981a408e3-config-volume\") pod \"collect-profiles-29328270-p4s99\" (UID: \"3f15f854-d530-42ad-a821-537981a408e3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" Oct 05 20:30:00 crc kubenswrapper[4753]: I1005 20:30:00.311548 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f15f854-d530-42ad-a821-537981a408e3-secret-volume\") pod \"collect-profiles-29328270-p4s99\" (UID: \"3f15f854-d530-42ad-a821-537981a408e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" Oct 05 20:30:00 crc kubenswrapper[4753]: I1005 20:30:00.311608 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ffnz\" (UniqueName: \"kubernetes.io/projected/3f15f854-d530-42ad-a821-537981a408e3-kube-api-access-6ffnz\") pod \"collect-profiles-29328270-p4s99\" (UID: \"3f15f854-d530-42ad-a821-537981a408e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" Oct 05 20:30:00 crc kubenswrapper[4753]: I1005 20:30:00.311649 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f15f854-d530-42ad-a821-537981a408e3-config-volume\") pod \"collect-profiles-29328270-p4s99\" (UID: \"3f15f854-d530-42ad-a821-537981a408e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" Oct 05 20:30:00 crc kubenswrapper[4753]: I1005 20:30:00.312794 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f15f854-d530-42ad-a821-537981a408e3-config-volume\") pod \"collect-profiles-29328270-p4s99\" (UID: \"3f15f854-d530-42ad-a821-537981a408e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" Oct 05 20:30:00 crc kubenswrapper[4753]: I1005 20:30:00.319286 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3f15f854-d530-42ad-a821-537981a408e3-secret-volume\") pod \"collect-profiles-29328270-p4s99\" (UID: \"3f15f854-d530-42ad-a821-537981a408e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" Oct 05 20:30:00 crc kubenswrapper[4753]: I1005 20:30:00.327634 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ffnz\" (UniqueName: \"kubernetes.io/projected/3f15f854-d530-42ad-a821-537981a408e3-kube-api-access-6ffnz\") pod \"collect-profiles-29328270-p4s99\" (UID: \"3f15f854-d530-42ad-a821-537981a408e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" Oct 05 20:30:00 crc kubenswrapper[4753]: I1005 20:30:00.467894 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" Oct 05 20:30:01 crc kubenswrapper[4753]: I1005 20:30:01.507242 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99"] Oct 05 20:30:01 crc kubenswrapper[4753]: I1005 20:30:01.600593 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7","Type":"ContainerStarted","Data":"bcf596f51693a845b4c689285dfa23f460fcb3429b8f3f6d6713e26b80b7625c"} Oct 05 20:30:01 crc kubenswrapper[4753]: I1005 20:30:01.605350 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c20016b6-f321-4cf2-b09a-d35b96c85805","Type":"ContainerStarted","Data":"9e6a754850d246b37623dca99c71468c99be8b73ea7300884c9d84f7c4845153"} Oct 05 20:30:01 crc kubenswrapper[4753]: I1005 20:30:01.607770 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8krg4" 
event={"ID":"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb","Type":"ContainerStarted","Data":"041c02010b81df57650da0c7e94edc80c1baaf43df6b9ee65f3bbc59838ad04b"} Oct 05 20:30:01 crc kubenswrapper[4753]: I1005 20:30:01.617041 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5d0d97ee-e7c7-4f1c-b232-b6377a0c890f","Type":"ContainerStarted","Data":"f96fabe284eb93f7b4e6d91af5e8dfce1b504522093955701a5a1f520c479d1f"} Oct 05 20:30:01 crc kubenswrapper[4753]: I1005 20:30:01.617180 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 05 20:30:01 crc kubenswrapper[4753]: I1005 20:30:01.632754 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"19914a64-715e-4a20-82fc-f4e86b8e9e21","Type":"ContainerStarted","Data":"44cfc11dcea38b67cad0ca9468234e2198ec803541d284b00570036fdce5f4b8"} Oct 05 20:30:01 crc kubenswrapper[4753]: I1005 20:30:01.634277 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" event={"ID":"3f15f854-d530-42ad-a821-537981a408e3","Type":"ContainerStarted","Data":"ef4358be0409abaf6801eeeffbe05e2078b63a547eeded5e4a8b0ebd2ef2dd4f"} Oct 05 20:30:01 crc kubenswrapper[4753]: I1005 20:30:01.637337 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"396e74f9-75f0-4643-a011-da8c56174984","Type":"ContainerStarted","Data":"8e1a681b7b7f072a5826008d725727017b860cd0728778a9b3705f6bad6b744d"} Oct 05 20:30:01 crc kubenswrapper[4753]: I1005 20:30:01.643456 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5091b95a-0011-45bb-b4b8-be273f03f7b4","Type":"ContainerStarted","Data":"f255565147fb0b0e665d9e68b5a93400c7476e5388726a8d253afdef69ef1a93"} Oct 05 20:30:01 crc kubenswrapper[4753]: I1005 20:30:01.643624 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/memcached-0" Oct 05 20:30:01 crc kubenswrapper[4753]: I1005 20:30:01.660728 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.801258314 podStartE2EDuration="25.660706974s" podCreationTimestamp="2025-10-05 20:29:36 +0000 UTC" firstStartedPulling="2025-10-05 20:29:50.223009153 +0000 UTC m=+899.071337385" lastFinishedPulling="2025-10-05 20:30:01.082457813 +0000 UTC m=+909.930786045" observedRunningTime="2025-10-05 20:30:01.654464779 +0000 UTC m=+910.502793021" watchObservedRunningTime="2025-10-05 20:30:01.660706974 +0000 UTC m=+910.509035206" Oct 05 20:30:01 crc kubenswrapper[4753]: I1005 20:30:01.711108 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=17.204624022 podStartE2EDuration="27.711090388s" podCreationTimestamp="2025-10-05 20:29:34 +0000 UTC" firstStartedPulling="2025-10-05 20:29:50.566370607 +0000 UTC m=+899.414698849" lastFinishedPulling="2025-10-05 20:30:01.072836973 +0000 UTC m=+909.921165215" observedRunningTime="2025-10-05 20:30:01.701636353 +0000 UTC m=+910.549964585" watchObservedRunningTime="2025-10-05 20:30:01.711090388 +0000 UTC m=+910.559418620" Oct 05 20:30:02 crc kubenswrapper[4753]: I1005 20:30:02.662174 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7zxq7" event={"ID":"61f845cb-9404-421b-b20f-9dee4edd00f8","Type":"ContainerStarted","Data":"ac3b5a11cefcb2b8e09343b16f93720f952bba7c72253aaf43c075d32af548a3"} Oct 05 20:30:02 crc kubenswrapper[4753]: I1005 20:30:02.662572 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-7zxq7" Oct 05 20:30:02 crc kubenswrapper[4753]: I1005 20:30:02.666447 4753 generic.go:334] "Generic (PLEG): container finished" podID="3f15f854-d530-42ad-a821-537981a408e3" containerID="98a7c79a6806a609d3663be2d9b211ac912550218710d03e40ec5f75ceccedb4" exitCode=0 Oct 05 
20:30:02 crc kubenswrapper[4753]: I1005 20:30:02.666487 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" event={"ID":"3f15f854-d530-42ad-a821-537981a408e3","Type":"ContainerDied","Data":"98a7c79a6806a609d3663be2d9b211ac912550218710d03e40ec5f75ceccedb4"} Oct 05 20:30:02 crc kubenswrapper[4753]: I1005 20:30:02.674695 4753 generic.go:334] "Generic (PLEG): container finished" podID="61cd2b9f-f08f-4b47-be04-1d9246a5cbdb" containerID="041c02010b81df57650da0c7e94edc80c1baaf43df6b9ee65f3bbc59838ad04b" exitCode=0 Oct 05 20:30:02 crc kubenswrapper[4753]: I1005 20:30:02.676738 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8krg4" event={"ID":"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb","Type":"ContainerDied","Data":"041c02010b81df57650da0c7e94edc80c1baaf43df6b9ee65f3bbc59838ad04b"} Oct 05 20:30:02 crc kubenswrapper[4753]: I1005 20:30:02.692919 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-7zxq7" podStartSLOduration=11.048981558 podStartE2EDuration="21.692901519s" podCreationTimestamp="2025-10-05 20:29:41 +0000 UTC" firstStartedPulling="2025-10-05 20:29:50.447313716 +0000 UTC m=+899.295641948" lastFinishedPulling="2025-10-05 20:30:01.091233657 +0000 UTC m=+909.939561909" observedRunningTime="2025-10-05 20:30:02.68168615 +0000 UTC m=+911.530014382" watchObservedRunningTime="2025-10-05 20:30:02.692901519 +0000 UTC m=+911.541229751" Oct 05 20:30:03 crc kubenswrapper[4753]: I1005 20:30:03.689485 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8krg4" event={"ID":"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb","Type":"ContainerStarted","Data":"c6137121bd11980b8614e751e6d5305d0e19602f27fb6a38c0156ecc744397fc"} Oct 05 20:30:03 crc kubenswrapper[4753]: I1005 20:30:03.689610 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8krg4" 
event={"ID":"61cd2b9f-f08f-4b47-be04-1d9246a5cbdb","Type":"ContainerStarted","Data":"5863b6c51c7d70e27edadb04760ae9250e2ed864cd28c79287a28dc0bb1433c5"} Oct 05 20:30:03 crc kubenswrapper[4753]: I1005 20:30:03.715991 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-8krg4" podStartSLOduration=13.424999352 podStartE2EDuration="22.715973491s" podCreationTimestamp="2025-10-05 20:29:41 +0000 UTC" firstStartedPulling="2025-10-05 20:29:51.801153307 +0000 UTC m=+900.649481549" lastFinishedPulling="2025-10-05 20:30:01.092127446 +0000 UTC m=+909.940455688" observedRunningTime="2025-10-05 20:30:03.709940732 +0000 UTC m=+912.558268964" watchObservedRunningTime="2025-10-05 20:30:03.715973491 +0000 UTC m=+912.564301733" Oct 05 20:30:04 crc kubenswrapper[4753]: I1005 20:30:04.696990 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:30:04 crc kubenswrapper[4753]: I1005 20:30:04.697585 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8krg4" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.425313 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-5xx4p"] Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.426393 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.428717 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.456081 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5xx4p"] Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.500670 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-ovs-rundir\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.500764 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.500785 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-config\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.500802 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-combined-ca-bundle\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " 
pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.500822 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz9n2\" (UniqueName: \"kubernetes.io/projected/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-kube-api-access-nz9n2\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.500840 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-ovn-rundir\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.566817 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b8749979c-dkzzq"] Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.568377 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.570071 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.577738 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b8749979c-dkzzq"] Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.601856 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-config\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.601910 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-combined-ca-bundle\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.601940 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz9n2\" (UniqueName: \"kubernetes.io/projected/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-kube-api-access-nz9n2\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.601971 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-ovn-rundir\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 
20:30:05.602044 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-config\") pod \"dnsmasq-dns-6b8749979c-dkzzq\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.602095 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g44qn\" (UniqueName: \"kubernetes.io/projected/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-kube-api-access-g44qn\") pod \"dnsmasq-dns-6b8749979c-dkzzq\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.602130 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-dns-svc\") pod \"dnsmasq-dns-6b8749979c-dkzzq\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.602180 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-ovs-rundir\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.602255 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-ovsdbserver-nb\") pod \"dnsmasq-dns-6b8749979c-dkzzq\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 
20:30:05.602292 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.602423 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-ovn-rundir\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.602695 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-config\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.602764 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-ovs-rundir\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.620377 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz9n2\" (UniqueName: \"kubernetes.io/projected/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-kube-api-access-nz9n2\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.623036 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.631390 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba751b9-51bf-4c3d-8a7a-35af6ebe354f-combined-ca-bundle\") pod \"ovn-controller-metrics-5xx4p\" (UID: \"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f\") " pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.704133 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-ovsdbserver-nb\") pod \"dnsmasq-dns-6b8749979c-dkzzq\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.704231 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-config\") pod \"dnsmasq-dns-6b8749979c-dkzzq\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.704275 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g44qn\" (UniqueName: \"kubernetes.io/projected/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-kube-api-access-g44qn\") pod \"dnsmasq-dns-6b8749979c-dkzzq\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.704301 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-dns-svc\") pod \"dnsmasq-dns-6b8749979c-dkzzq\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.705054 4753 generic.go:334] "Generic (PLEG): container finished" podID="c20016b6-f321-4cf2-b09a-d35b96c85805" containerID="9e6a754850d246b37623dca99c71468c99be8b73ea7300884c9d84f7c4845153" exitCode=0 Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.705128 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c20016b6-f321-4cf2-b09a-d35b96c85805","Type":"ContainerDied","Data":"9e6a754850d246b37623dca99c71468c99be8b73ea7300884c9d84f7c4845153"} Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.705672 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-ovsdbserver-nb\") pod \"dnsmasq-dns-6b8749979c-dkzzq\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.705769 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-config\") pod \"dnsmasq-dns-6b8749979c-dkzzq\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.705948 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-dns-svc\") pod \"dnsmasq-dns-6b8749979c-dkzzq\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.707275 4753 generic.go:334] "Generic (PLEG): container 
finished" podID="81bd134f-0bbd-4cda-b29c-d4d514d4dbe7" containerID="bcf596f51693a845b4c689285dfa23f460fcb3429b8f3f6d6713e26b80b7625c" exitCode=0 Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.707787 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7","Type":"ContainerDied","Data":"bcf596f51693a845b4c689285dfa23f460fcb3429b8f3f6d6713e26b80b7625c"} Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.736946 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g44qn\" (UniqueName: \"kubernetes.io/projected/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-kube-api-access-g44qn\") pod \"dnsmasq-dns-6b8749979c-dkzzq\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.745454 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-5xx4p" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.848693 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b8749979c-dkzzq"] Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.849239 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.889501 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8454ffc489-w9l78"] Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.890789 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.895736 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.907824 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-ovsdbserver-nb\") pod \"dnsmasq-dns-8454ffc489-w9l78\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.907901 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-ovsdbserver-sb\") pod \"dnsmasq-dns-8454ffc489-w9l78\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.908179 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-dns-svc\") pod \"dnsmasq-dns-8454ffc489-w9l78\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.908250 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8454ffc489-w9l78"] Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.908256 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-config\") pod \"dnsmasq-dns-8454ffc489-w9l78\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " pod="openstack/dnsmasq-dns-8454ffc489-w9l78" 
Oct 05 20:30:05 crc kubenswrapper[4753]: I1005 20:30:05.908392 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-656dx\" (UniqueName: \"kubernetes.io/projected/ff73c79b-7168-4596-b74c-136ff3bfff2f-kube-api-access-656dx\") pod \"dnsmasq-dns-8454ffc489-w9l78\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:06 crc kubenswrapper[4753]: I1005 20:30:06.010277 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-dns-svc\") pod \"dnsmasq-dns-8454ffc489-w9l78\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:06 crc kubenswrapper[4753]: I1005 20:30:06.010342 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-config\") pod \"dnsmasq-dns-8454ffc489-w9l78\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:06 crc kubenswrapper[4753]: I1005 20:30:06.010404 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-656dx\" (UniqueName: \"kubernetes.io/projected/ff73c79b-7168-4596-b74c-136ff3bfff2f-kube-api-access-656dx\") pod \"dnsmasq-dns-8454ffc489-w9l78\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:06 crc kubenswrapper[4753]: I1005 20:30:06.010444 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-ovsdbserver-nb\") pod \"dnsmasq-dns-8454ffc489-w9l78\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:06 crc 
kubenswrapper[4753]: I1005 20:30:06.010480 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-ovsdbserver-sb\") pod \"dnsmasq-dns-8454ffc489-w9l78\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:06 crc kubenswrapper[4753]: I1005 20:30:06.011130 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-dns-svc\") pod \"dnsmasq-dns-8454ffc489-w9l78\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:06 crc kubenswrapper[4753]: I1005 20:30:06.011236 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-ovsdbserver-sb\") pod \"dnsmasq-dns-8454ffc489-w9l78\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:06 crc kubenswrapper[4753]: I1005 20:30:06.011737 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-config\") pod \"dnsmasq-dns-8454ffc489-w9l78\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:06 crc kubenswrapper[4753]: I1005 20:30:06.011744 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-ovsdbserver-nb\") pod \"dnsmasq-dns-8454ffc489-w9l78\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:06 crc kubenswrapper[4753]: I1005 20:30:06.028461 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-656dx\" (UniqueName: \"kubernetes.io/projected/ff73c79b-7168-4596-b74c-136ff3bfff2f-kube-api-access-656dx\") pod \"dnsmasq-dns-8454ffc489-w9l78\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:06 crc kubenswrapper[4753]: I1005 20:30:06.207706 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:06 crc kubenswrapper[4753]: I1005 20:30:06.665011 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 05 20:30:07 crc kubenswrapper[4753]: I1005 20:30:07.542501 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" Oct 05 20:30:07 crc kubenswrapper[4753]: I1005 20:30:07.641534 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ffnz\" (UniqueName: \"kubernetes.io/projected/3f15f854-d530-42ad-a821-537981a408e3-kube-api-access-6ffnz\") pod \"3f15f854-d530-42ad-a821-537981a408e3\" (UID: \"3f15f854-d530-42ad-a821-537981a408e3\") " Oct 05 20:30:07 crc kubenswrapper[4753]: I1005 20:30:07.641599 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f15f854-d530-42ad-a821-537981a408e3-config-volume\") pod \"3f15f854-d530-42ad-a821-537981a408e3\" (UID: \"3f15f854-d530-42ad-a821-537981a408e3\") " Oct 05 20:30:07 crc kubenswrapper[4753]: I1005 20:30:07.641705 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f15f854-d530-42ad-a821-537981a408e3-secret-volume\") pod \"3f15f854-d530-42ad-a821-537981a408e3\" (UID: \"3f15f854-d530-42ad-a821-537981a408e3\") " Oct 05 20:30:07 crc kubenswrapper[4753]: I1005 20:30:07.642644 4753 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f15f854-d530-42ad-a821-537981a408e3-config-volume" (OuterVolumeSpecName: "config-volume") pod "3f15f854-d530-42ad-a821-537981a408e3" (UID: "3f15f854-d530-42ad-a821-537981a408e3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:30:07 crc kubenswrapper[4753]: I1005 20:30:07.654326 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f15f854-d530-42ad-a821-537981a408e3-kube-api-access-6ffnz" (OuterVolumeSpecName: "kube-api-access-6ffnz") pod "3f15f854-d530-42ad-a821-537981a408e3" (UID: "3f15f854-d530-42ad-a821-537981a408e3"). InnerVolumeSpecName "kube-api-access-6ffnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:30:07 crc kubenswrapper[4753]: I1005 20:30:07.654727 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f15f854-d530-42ad-a821-537981a408e3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3f15f854-d530-42ad-a821-537981a408e3" (UID: "3f15f854-d530-42ad-a821-537981a408e3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:30:07 crc kubenswrapper[4753]: I1005 20:30:07.742489 4753 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f15f854-d530-42ad-a821-537981a408e3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:07 crc kubenswrapper[4753]: I1005 20:30:07.742545 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ffnz\" (UniqueName: \"kubernetes.io/projected/3f15f854-d530-42ad-a821-537981a408e3-kube-api-access-6ffnz\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:07 crc kubenswrapper[4753]: I1005 20:30:07.742554 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f15f854-d530-42ad-a821-537981a408e3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:07 crc kubenswrapper[4753]: I1005 20:30:07.768189 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" event={"ID":"3f15f854-d530-42ad-a821-537981a408e3","Type":"ContainerDied","Data":"ef4358be0409abaf6801eeeffbe05e2078b63a547eeded5e4a8b0ebd2ef2dd4f"} Oct 05 20:30:07 crc kubenswrapper[4753]: I1005 20:30:07.768225 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef4358be0409abaf6801eeeffbe05e2078b63a547eeded5e4a8b0ebd2ef2dd4f" Oct 05 20:30:07 crc kubenswrapper[4753]: I1005 20:30:07.768209 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99" Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.378987 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b8749979c-dkzzq"] Oct 05 20:30:08 crc kubenswrapper[4753]: W1005 20:30:08.383875 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6450cdbe_01f9_4878_89cf_28c2a1d53fa0.slice/crio-9f433679b4ce765b25970f19c812c1201420ea7401bea91a3c5d1acaa63193e1 WatchSource:0}: Error finding container 9f433679b4ce765b25970f19c812c1201420ea7401bea91a3c5d1acaa63193e1: Status 404 returned error can't find the container with id 9f433679b4ce765b25970f19c812c1201420ea7401bea91a3c5d1acaa63193e1 Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.506638 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5xx4p"] Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.515754 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8454ffc489-w9l78"] Oct 05 20:30:08 crc kubenswrapper[4753]: W1005 20:30:08.527431 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff73c79b_7168_4596_b74c_136ff3bfff2f.slice/crio-6bddacf8424aee76c6b03c21ec58520f325cd5f75f9fdce84847f7a4e59996a9 WatchSource:0}: Error finding container 6bddacf8424aee76c6b03c21ec58520f325cd5f75f9fdce84847f7a4e59996a9: Status 404 returned error can't find the container with id 6bddacf8424aee76c6b03c21ec58520f325cd5f75f9fdce84847f7a4e59996a9 Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.777762 4753 generic.go:334] "Generic (PLEG): container finished" podID="6450cdbe-01f9-4878-89cf-28c2a1d53fa0" containerID="b2ba6ddbe48e511ed670ead9365978fd42df1b83104ad9ff41514b55f53c3f18" exitCode=0 Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.777919 4753 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" event={"ID":"6450cdbe-01f9-4878-89cf-28c2a1d53fa0","Type":"ContainerDied","Data":"b2ba6ddbe48e511ed670ead9365978fd42df1b83104ad9ff41514b55f53c3f18"} Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.778116 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" event={"ID":"6450cdbe-01f9-4878-89cf-28c2a1d53fa0","Type":"ContainerStarted","Data":"9f433679b4ce765b25970f19c812c1201420ea7401bea91a3c5d1acaa63193e1"} Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.781977 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"19914a64-715e-4a20-82fc-f4e86b8e9e21","Type":"ContainerStarted","Data":"511d8454e9fd0c4cf1be631d3581d0ac929800bac762fa329bae48f38f880837"} Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.785703 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5xx4p" event={"ID":"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f","Type":"ContainerStarted","Data":"ebf309ee920ae75c38abaf29f672fe369c61779858f41865d81d32d143d1c26a"} Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.787026 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8454ffc489-w9l78" event={"ID":"ff73c79b-7168-4596-b74c-136ff3bfff2f","Type":"ContainerStarted","Data":"6d0ab7710a68faf6a81708df96132f3f2485902d834accaa608d5eea2bf40135"} Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.787060 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8454ffc489-w9l78" event={"ID":"ff73c79b-7168-4596-b74c-136ff3bfff2f","Type":"ContainerStarted","Data":"6bddacf8424aee76c6b03c21ec58520f325cd5f75f9fdce84847f7a4e59996a9"} Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.789271 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"81bd134f-0bbd-4cda-b29c-d4d514d4dbe7","Type":"ContainerStarted","Data":"8ccfd07504d11b5aae1a11cd9d883f07ab27a53df3fc415c86ddcbbbb43db745"} Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.791429 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"396e74f9-75f0-4643-a011-da8c56174984","Type":"ContainerStarted","Data":"6df0cf09b49d55aede4f4a27b3e9d98a90ddab9bce05f29d70fbd7145e69b4da"} Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.802590 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c20016b6-f321-4cf2-b09a-d35b96c85805","Type":"ContainerStarted","Data":"6f719972984ee9ff4d9f5911a0a5cdb1f4b1ad34745f0920db6e8c669430bddc"} Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.846878 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=24.849035301 podStartE2EDuration="35.846862081s" podCreationTimestamp="2025-10-05 20:29:33 +0000 UTC" firstStartedPulling="2025-10-05 20:29:50.11937643 +0000 UTC m=+898.967704662" lastFinishedPulling="2025-10-05 20:30:01.11720321 +0000 UTC m=+909.965531442" observedRunningTime="2025-10-05 20:30:08.845724076 +0000 UTC m=+917.694052308" watchObservedRunningTime="2025-10-05 20:30:08.846862081 +0000 UTC m=+917.695190313" Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.849390 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.450132078 podStartE2EDuration="28.84938474s" podCreationTimestamp="2025-10-05 20:29:40 +0000 UTC" firstStartedPulling="2025-10-05 20:29:50.634743337 +0000 UTC m=+899.483071569" lastFinishedPulling="2025-10-05 20:30:08.033995999 +0000 UTC m=+916.882324231" observedRunningTime="2025-10-05 20:30:08.824511253 +0000 UTC m=+917.672839485" watchObservedRunningTime="2025-10-05 20:30:08.84938474 +0000 UTC m=+917.697712962" Oct 05 
20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.861298 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.53507795 podStartE2EDuration="26.861283632s" podCreationTimestamp="2025-10-05 20:29:42 +0000 UTC" firstStartedPulling="2025-10-05 20:29:51.68995124 +0000 UTC m=+900.538279472" lastFinishedPulling="2025-10-05 20:30:08.016156922 +0000 UTC m=+916.864485154" observedRunningTime="2025-10-05 20:30:08.859577618 +0000 UTC m=+917.707905850" watchObservedRunningTime="2025-10-05 20:30:08.861283632 +0000 UTC m=+917.709611864" Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.901873 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-5xx4p" podStartSLOduration=3.901854599 podStartE2EDuration="3.901854599s" podCreationTimestamp="2025-10-05 20:30:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:30:08.900761435 +0000 UTC m=+917.749089657" watchObservedRunningTime="2025-10-05 20:30:08.901854599 +0000 UTC m=+917.750182831" Oct 05 20:30:08 crc kubenswrapper[4753]: I1005 20:30:08.956444 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.534656024 podStartE2EDuration="35.956427275s" podCreationTimestamp="2025-10-05 20:29:33 +0000 UTC" firstStartedPulling="2025-10-05 20:29:49.651159535 +0000 UTC m=+898.499487767" lastFinishedPulling="2025-10-05 20:30:01.072930796 +0000 UTC m=+909.921259018" observedRunningTime="2025-10-05 20:30:08.948646642 +0000 UTC m=+917.796974894" watchObservedRunningTime="2025-10-05 20:30:08.956427275 +0000 UTC m=+917.804755497" Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.038321 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.170974 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-ovsdbserver-nb\") pod \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.171031 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-dns-svc\") pod \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.171119 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-config\") pod \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.171190 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g44qn\" (UniqueName: \"kubernetes.io/projected/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-kube-api-access-g44qn\") pod \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\" (UID: \"6450cdbe-01f9-4878-89cf-28c2a1d53fa0\") " Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.181201 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-kube-api-access-g44qn" (OuterVolumeSpecName: "kube-api-access-g44qn") pod "6450cdbe-01f9-4878-89cf-28c2a1d53fa0" (UID: "6450cdbe-01f9-4878-89cf-28c2a1d53fa0"). InnerVolumeSpecName "kube-api-access-g44qn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.190120 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6450cdbe-01f9-4878-89cf-28c2a1d53fa0" (UID: "6450cdbe-01f9-4878-89cf-28c2a1d53fa0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.190183 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6450cdbe-01f9-4878-89cf-28c2a1d53fa0" (UID: "6450cdbe-01f9-4878-89cf-28c2a1d53fa0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.196683 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-config" (OuterVolumeSpecName: "config") pod "6450cdbe-01f9-4878-89cf-28c2a1d53fa0" (UID: "6450cdbe-01f9-4878-89cf-28c2a1d53fa0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.272791 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.272995 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.273053 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.273127 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g44qn\" (UniqueName: \"kubernetes.io/projected/6450cdbe-01f9-4878-89cf-28c2a1d53fa0-kube-api-access-g44qn\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.820024 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" event={"ID":"6450cdbe-01f9-4878-89cf-28c2a1d53fa0","Type":"ContainerDied","Data":"9f433679b4ce765b25970f19c812c1201420ea7401bea91a3c5d1acaa63193e1"} Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.820116 4753 scope.go:117] "RemoveContainer" containerID="b2ba6ddbe48e511ed670ead9365978fd42df1b83104ad9ff41514b55f53c3f18" Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.820276 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.832429 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5xx4p" event={"ID":"2ba751b9-51bf-4c3d-8a7a-35af6ebe354f","Type":"ContainerStarted","Data":"2d23c5f09708d7474f419bcdc3732f9bfa2f345bb3bcaccc292156ebfca84fd5"} Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.836935 4753 generic.go:334] "Generic (PLEG): container finished" podID="ff73c79b-7168-4596-b74c-136ff3bfff2f" containerID="6d0ab7710a68faf6a81708df96132f3f2485902d834accaa608d5eea2bf40135" exitCode=0 Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.838385 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8454ffc489-w9l78" event={"ID":"ff73c79b-7168-4596-b74c-136ff3bfff2f","Type":"ContainerDied","Data":"6d0ab7710a68faf6a81708df96132f3f2485902d834accaa608d5eea2bf40135"} Oct 05 20:30:09 crc kubenswrapper[4753]: I1005 20:30:09.892883 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 05 20:30:10 crc kubenswrapper[4753]: I1005 20:30:10.522277 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 05 20:30:10 crc kubenswrapper[4753]: I1005 20:30:10.589122 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 05 20:30:10 crc kubenswrapper[4753]: I1005 20:30:10.845747 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8454ffc489-w9l78" event={"ID":"ff73c79b-7168-4596-b74c-136ff3bfff2f","Type":"ContainerStarted","Data":"e5a1bcd610b0456cf97746c2c7f514702f64262ca03ee94df46bef50b2bb7436"} Oct 05 20:30:10 crc kubenswrapper[4753]: I1005 20:30:10.845934 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 05 20:30:10 crc kubenswrapper[4753]: I1005 
20:30:10.868966 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8454ffc489-w9l78" podStartSLOduration=5.868948682 podStartE2EDuration="5.868948682s" podCreationTimestamp="2025-10-05 20:30:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:30:10.863754129 +0000 UTC m=+919.712082361" watchObservedRunningTime="2025-10-05 20:30:10.868948682 +0000 UTC m=+919.717276914" Oct 05 20:30:10 crc kubenswrapper[4753]: I1005 20:30:10.890191 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 05 20:30:11 crc kubenswrapper[4753]: I1005 20:30:11.208390 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:11 crc kubenswrapper[4753]: I1005 20:30:11.462986 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 05 20:30:11 crc kubenswrapper[4753]: I1005 20:30:11.464033 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 05 20:30:11 crc kubenswrapper[4753]: I1005 20:30:11.513389 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 05 20:30:11 crc kubenswrapper[4753]: I1005 20:30:11.903449 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.098523 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 05 20:30:12 crc kubenswrapper[4753]: E1005 20:30:12.098810 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f15f854-d530-42ad-a821-537981a408e3" containerName="collect-profiles" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.098821 4753 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3f15f854-d530-42ad-a821-537981a408e3" containerName="collect-profiles" Oct 05 20:30:12 crc kubenswrapper[4753]: E1005 20:30:12.098836 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6450cdbe-01f9-4878-89cf-28c2a1d53fa0" containerName="init" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.098842 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6450cdbe-01f9-4878-89cf-28c2a1d53fa0" containerName="init" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.100410 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6450cdbe-01f9-4878-89cf-28c2a1d53fa0" containerName="init" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.100429 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f15f854-d530-42ad-a821-537981a408e3" containerName="collect-profiles" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.101232 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.104869 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-srxdn" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.105318 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.105454 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.108956 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.126950 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.245619 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0410bd72-9899-4174-9258-4efbdc6cd7c8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.245672 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdhwz\" (UniqueName: \"kubernetes.io/projected/0410bd72-9899-4174-9258-4efbdc6cd7c8-kube-api-access-qdhwz\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.245704 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0410bd72-9899-4174-9258-4efbdc6cd7c8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.245762 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0410bd72-9899-4174-9258-4efbdc6cd7c8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.245999 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0410bd72-9899-4174-9258-4efbdc6cd7c8-config\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.246065 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/0410bd72-9899-4174-9258-4efbdc6cd7c8-scripts\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.246295 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/0410bd72-9899-4174-9258-4efbdc6cd7c8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.347831 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0410bd72-9899-4174-9258-4efbdc6cd7c8-config\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.347874 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0410bd72-9899-4174-9258-4efbdc6cd7c8-scripts\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.347919 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/0410bd72-9899-4174-9258-4efbdc6cd7c8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.347938 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0410bd72-9899-4174-9258-4efbdc6cd7c8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 
crc kubenswrapper[4753]: I1005 20:30:12.347955 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdhwz\" (UniqueName: \"kubernetes.io/projected/0410bd72-9899-4174-9258-4efbdc6cd7c8-kube-api-access-qdhwz\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.347991 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0410bd72-9899-4174-9258-4efbdc6cd7c8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.348025 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0410bd72-9899-4174-9258-4efbdc6cd7c8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.349100 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0410bd72-9899-4174-9258-4efbdc6cd7c8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.349570 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0410bd72-9899-4174-9258-4efbdc6cd7c8-scripts\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.350272 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0410bd72-9899-4174-9258-4efbdc6cd7c8-config\") pod \"ovn-northd-0\" 
(UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.357130 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/0410bd72-9899-4174-9258-4efbdc6cd7c8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.363028 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0410bd72-9899-4174-9258-4efbdc6cd7c8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.365197 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0410bd72-9899-4174-9258-4efbdc6cd7c8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.368927 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdhwz\" (UniqueName: \"kubernetes.io/projected/0410bd72-9899-4174-9258-4efbdc6cd7c8-kube-api-access-qdhwz\") pod \"ovn-northd-0\" (UID: \"0410bd72-9899-4174-9258-4efbdc6cd7c8\") " pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.419407 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.829731 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 05 20:30:12 crc kubenswrapper[4753]: I1005 20:30:12.865025 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0410bd72-9899-4174-9258-4efbdc6cd7c8","Type":"ContainerStarted","Data":"6a09dbed0c44e0d426b1de5f85004e562ca173946fd64c02d2479cac03ba13dc"} Oct 05 20:30:14 crc kubenswrapper[4753]: I1005 20:30:14.749262 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 05 20:30:14 crc kubenswrapper[4753]: I1005 20:30:14.749862 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 05 20:30:14 crc kubenswrapper[4753]: I1005 20:30:14.790940 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 05 20:30:14 crc kubenswrapper[4753]: I1005 20:30:14.820590 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 05 20:30:14 crc kubenswrapper[4753]: I1005 20:30:14.820628 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 05 20:30:14 crc kubenswrapper[4753]: I1005 20:30:14.865981 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 05 20:30:14 crc kubenswrapper[4753]: I1005 20:30:14.880327 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0410bd72-9899-4174-9258-4efbdc6cd7c8","Type":"ContainerStarted","Data":"b054a322819899e7a005e28cbda12cc0a5c2207b6bf52e0cea256457d5b0e909"} Oct 05 20:30:14 crc kubenswrapper[4753]: I1005 20:30:14.881890 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"0410bd72-9899-4174-9258-4efbdc6cd7c8","Type":"ContainerStarted","Data":"5fcdd0d64cb97916f9def9e828e7190af8224750890efa8f5e3e61ba6bad081f"} Oct 05 20:30:14 crc kubenswrapper[4753]: I1005 20:30:14.918625 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.430945864 podStartE2EDuration="2.918607384s" podCreationTimestamp="2025-10-05 20:30:12 +0000 UTC" firstStartedPulling="2025-10-05 20:30:12.839935204 +0000 UTC m=+921.688263436" lastFinishedPulling="2025-10-05 20:30:14.327596724 +0000 UTC m=+923.175924956" observedRunningTime="2025-10-05 20:30:14.908168498 +0000 UTC m=+923.756496730" watchObservedRunningTime="2025-10-05 20:30:14.918607384 +0000 UTC m=+923.766935626" Oct 05 20:30:14 crc kubenswrapper[4753]: I1005 20:30:14.929773 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 05 20:30:14 crc kubenswrapper[4753]: I1005 20:30:14.935950 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 05 20:30:15 crc kubenswrapper[4753]: I1005 20:30:15.887316 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.209097 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.261998 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d4d9f7875-wd8wh"] Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.262283 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" podUID="c0adcc08-1f43-48d5-936d-83797872bb43" containerName="dnsmasq-dns" containerID="cri-o://7b348c9b0f4c9037701ad7dbf03f361e6b34eae12a474f170e03f2a2d8ee5a6c" gracePeriod=10 Oct 05 20:30:16 crc 
kubenswrapper[4753]: I1005 20:30:16.705883 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.818225 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0adcc08-1f43-48d5-936d-83797872bb43-dns-svc\") pod \"c0adcc08-1f43-48d5-936d-83797872bb43\" (UID: \"c0adcc08-1f43-48d5-936d-83797872bb43\") " Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.818268 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0adcc08-1f43-48d5-936d-83797872bb43-config\") pod \"c0adcc08-1f43-48d5-936d-83797872bb43\" (UID: \"c0adcc08-1f43-48d5-936d-83797872bb43\") " Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.818327 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdnk6\" (UniqueName: \"kubernetes.io/projected/c0adcc08-1f43-48d5-936d-83797872bb43-kube-api-access-wdnk6\") pod \"c0adcc08-1f43-48d5-936d-83797872bb43\" (UID: \"c0adcc08-1f43-48d5-936d-83797872bb43\") " Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.843356 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0adcc08-1f43-48d5-936d-83797872bb43-kube-api-access-wdnk6" (OuterVolumeSpecName: "kube-api-access-wdnk6") pod "c0adcc08-1f43-48d5-936d-83797872bb43" (UID: "c0adcc08-1f43-48d5-936d-83797872bb43"). InnerVolumeSpecName "kube-api-access-wdnk6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.890059 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0adcc08-1f43-48d5-936d-83797872bb43-config" (OuterVolumeSpecName: "config") pod "c0adcc08-1f43-48d5-936d-83797872bb43" (UID: "c0adcc08-1f43-48d5-936d-83797872bb43"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.898609 4753 generic.go:334] "Generic (PLEG): container finished" podID="c0adcc08-1f43-48d5-936d-83797872bb43" containerID="7b348c9b0f4c9037701ad7dbf03f361e6b34eae12a474f170e03f2a2d8ee5a6c" exitCode=0 Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.898936 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.899133 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" event={"ID":"c0adcc08-1f43-48d5-936d-83797872bb43","Type":"ContainerDied","Data":"7b348c9b0f4c9037701ad7dbf03f361e6b34eae12a474f170e03f2a2d8ee5a6c"} Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.899215 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d4d9f7875-wd8wh" event={"ID":"c0adcc08-1f43-48d5-936d-83797872bb43","Type":"ContainerDied","Data":"f689e810df661a24228e3db72b23b19a7827c455161d1c4d056ae16af3838ea6"} Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.899237 4753 scope.go:117] "RemoveContainer" containerID="7b348c9b0f4c9037701ad7dbf03f361e6b34eae12a474f170e03f2a2d8ee5a6c" Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.902397 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0adcc08-1f43-48d5-936d-83797872bb43-dns-svc" (OuterVolumeSpecName: "dns-svc") pod 
"c0adcc08-1f43-48d5-936d-83797872bb43" (UID: "c0adcc08-1f43-48d5-936d-83797872bb43"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.921356 4753 scope.go:117] "RemoveContainer" containerID="b5a1ef213ffa6ddec609c2f2dfcc8bc5e7a2343793097bbd1c3cf2827a5fe6f1" Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.921645 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0adcc08-1f43-48d5-936d-83797872bb43-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.921671 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0adcc08-1f43-48d5-936d-83797872bb43-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.921680 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdnk6\" (UniqueName: \"kubernetes.io/projected/c0adcc08-1f43-48d5-936d-83797872bb43-kube-api-access-wdnk6\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.946660 4753 scope.go:117] "RemoveContainer" containerID="7b348c9b0f4c9037701ad7dbf03f361e6b34eae12a474f170e03f2a2d8ee5a6c" Oct 05 20:30:16 crc kubenswrapper[4753]: E1005 20:30:16.947729 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b348c9b0f4c9037701ad7dbf03f361e6b34eae12a474f170e03f2a2d8ee5a6c\": container with ID starting with 7b348c9b0f4c9037701ad7dbf03f361e6b34eae12a474f170e03f2a2d8ee5a6c not found: ID does not exist" containerID="7b348c9b0f4c9037701ad7dbf03f361e6b34eae12a474f170e03f2a2d8ee5a6c" Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.947770 4753 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7b348c9b0f4c9037701ad7dbf03f361e6b34eae12a474f170e03f2a2d8ee5a6c"} err="failed to get container status \"7b348c9b0f4c9037701ad7dbf03f361e6b34eae12a474f170e03f2a2d8ee5a6c\": rpc error: code = NotFound desc = could not find container \"7b348c9b0f4c9037701ad7dbf03f361e6b34eae12a474f170e03f2a2d8ee5a6c\": container with ID starting with 7b348c9b0f4c9037701ad7dbf03f361e6b34eae12a474f170e03f2a2d8ee5a6c not found: ID does not exist" Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.947797 4753 scope.go:117] "RemoveContainer" containerID="b5a1ef213ffa6ddec609c2f2dfcc8bc5e7a2343793097bbd1c3cf2827a5fe6f1" Oct 05 20:30:16 crc kubenswrapper[4753]: E1005 20:30:16.948036 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5a1ef213ffa6ddec609c2f2dfcc8bc5e7a2343793097bbd1c3cf2827a5fe6f1\": container with ID starting with b5a1ef213ffa6ddec609c2f2dfcc8bc5e7a2343793097bbd1c3cf2827a5fe6f1 not found: ID does not exist" containerID="b5a1ef213ffa6ddec609c2f2dfcc8bc5e7a2343793097bbd1c3cf2827a5fe6f1" Oct 05 20:30:16 crc kubenswrapper[4753]: I1005 20:30:16.948055 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5a1ef213ffa6ddec609c2f2dfcc8bc5e7a2343793097bbd1c3cf2827a5fe6f1"} err="failed to get container status \"b5a1ef213ffa6ddec609c2f2dfcc8bc5e7a2343793097bbd1c3cf2827a5fe6f1\": rpc error: code = NotFound desc = could not find container \"b5a1ef213ffa6ddec609c2f2dfcc8bc5e7a2343793097bbd1c3cf2827a5fe6f1\": container with ID starting with b5a1ef213ffa6ddec609c2f2dfcc8bc5e7a2343793097bbd1c3cf2827a5fe6f1 not found: ID does not exist" Oct 05 20:30:17 crc kubenswrapper[4753]: I1005 20:30:17.225732 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d4d9f7875-wd8wh"] Oct 05 20:30:17 crc kubenswrapper[4753]: I1005 20:30:17.231712 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-5d4d9f7875-wd8wh"] Oct 05 20:30:17 crc kubenswrapper[4753]: I1005 20:30:17.864919 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0adcc08-1f43-48d5-936d-83797872bb43" path="/var/lib/kubelet/pods/c0adcc08-1f43-48d5-936d-83797872bb43/volumes" Oct 05 20:30:20 crc kubenswrapper[4753]: I1005 20:30:20.151594 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-ps278"] Oct 05 20:30:20 crc kubenswrapper[4753]: E1005 20:30:20.152556 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0adcc08-1f43-48d5-936d-83797872bb43" containerName="init" Oct 05 20:30:20 crc kubenswrapper[4753]: I1005 20:30:20.152576 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0adcc08-1f43-48d5-936d-83797872bb43" containerName="init" Oct 05 20:30:20 crc kubenswrapper[4753]: E1005 20:30:20.152603 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0adcc08-1f43-48d5-936d-83797872bb43" containerName="dnsmasq-dns" Oct 05 20:30:20 crc kubenswrapper[4753]: I1005 20:30:20.152612 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0adcc08-1f43-48d5-936d-83797872bb43" containerName="dnsmasq-dns" Oct 05 20:30:20 crc kubenswrapper[4753]: I1005 20:30:20.152797 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0adcc08-1f43-48d5-936d-83797872bb43" containerName="dnsmasq-dns" Oct 05 20:30:20 crc kubenswrapper[4753]: I1005 20:30:20.153483 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ps278" Oct 05 20:30:20 crc kubenswrapper[4753]: I1005 20:30:20.174489 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhzp2\" (UniqueName: \"kubernetes.io/projected/a5bca284-a880-4ea8-a0fa-74a08ac1f840-kube-api-access-fhzp2\") pod \"glance-db-create-ps278\" (UID: \"a5bca284-a880-4ea8-a0fa-74a08ac1f840\") " pod="openstack/glance-db-create-ps278" Oct 05 20:30:20 crc kubenswrapper[4753]: I1005 20:30:20.179299 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ps278"] Oct 05 20:30:20 crc kubenswrapper[4753]: I1005 20:30:20.275651 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhzp2\" (UniqueName: \"kubernetes.io/projected/a5bca284-a880-4ea8-a0fa-74a08ac1f840-kube-api-access-fhzp2\") pod \"glance-db-create-ps278\" (UID: \"a5bca284-a880-4ea8-a0fa-74a08ac1f840\") " pod="openstack/glance-db-create-ps278" Oct 05 20:30:20 crc kubenswrapper[4753]: I1005 20:30:20.294785 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhzp2\" (UniqueName: \"kubernetes.io/projected/a5bca284-a880-4ea8-a0fa-74a08ac1f840-kube-api-access-fhzp2\") pod \"glance-db-create-ps278\" (UID: \"a5bca284-a880-4ea8-a0fa-74a08ac1f840\") " pod="openstack/glance-db-create-ps278" Oct 05 20:30:20 crc kubenswrapper[4753]: I1005 20:30:20.476561 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ps278" Oct 05 20:30:20 crc kubenswrapper[4753]: I1005 20:30:20.924610 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ps278"] Oct 05 20:30:20 crc kubenswrapper[4753]: W1005 20:30:20.928492 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5bca284_a880_4ea8_a0fa_74a08ac1f840.slice/crio-adf3c7e46174079a211484b8bc52bf97c512f8170e4fa260d220ccfd8bbc35c4 WatchSource:0}: Error finding container adf3c7e46174079a211484b8bc52bf97c512f8170e4fa260d220ccfd8bbc35c4: Status 404 returned error can't find the container with id adf3c7e46174079a211484b8bc52bf97c512f8170e4fa260d220ccfd8bbc35c4 Oct 05 20:30:21 crc kubenswrapper[4753]: I1005 20:30:21.936676 4753 generic.go:334] "Generic (PLEG): container finished" podID="a5bca284-a880-4ea8-a0fa-74a08ac1f840" containerID="617ff2b16eef7f0782a945c8a904bf55feba537c39ada4f52a808e644ca179fe" exitCode=0 Oct 05 20:30:21 crc kubenswrapper[4753]: I1005 20:30:21.936723 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ps278" event={"ID":"a5bca284-a880-4ea8-a0fa-74a08ac1f840","Type":"ContainerDied","Data":"617ff2b16eef7f0782a945c8a904bf55feba537c39ada4f52a808e644ca179fe"} Oct 05 20:30:21 crc kubenswrapper[4753]: I1005 20:30:21.937963 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ps278" event={"ID":"a5bca284-a880-4ea8-a0fa-74a08ac1f840","Type":"ContainerStarted","Data":"adf3c7e46174079a211484b8bc52bf97c512f8170e4fa260d220ccfd8bbc35c4"} Oct 05 20:30:22 crc kubenswrapper[4753]: I1005 20:30:22.947711 4753 generic.go:334] "Generic (PLEG): container finished" podID="73d182e9-8e4b-46ce-aa0b-6fd751eefecd" containerID="a730afd36443ffc2b80abbc60e2cca6017825e03868c24ebe4bb442fedc5ae24" exitCode=0 Oct 05 20:30:22 crc kubenswrapper[4753]: I1005 20:30:22.947804 4753 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73d182e9-8e4b-46ce-aa0b-6fd751eefecd","Type":"ContainerDied","Data":"a730afd36443ffc2b80abbc60e2cca6017825e03868c24ebe4bb442fedc5ae24"} Oct 05 20:30:22 crc kubenswrapper[4753]: I1005 20:30:22.952062 4753 generic.go:334] "Generic (PLEG): container finished" podID="6f77d64e-7c7a-4770-9710-8c4aa767bcfa" containerID="5147e49072301e23beb6f2459787fe2e686175dda9c52f176ca87e213655d772" exitCode=0 Oct 05 20:30:22 crc kubenswrapper[4753]: I1005 20:30:22.952539 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f77d64e-7c7a-4770-9710-8c4aa767bcfa","Type":"ContainerDied","Data":"5147e49072301e23beb6f2459787fe2e686175dda9c52f176ca87e213655d772"} Oct 05 20:30:23 crc kubenswrapper[4753]: I1005 20:30:23.337833 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ps278" Oct 05 20:30:23 crc kubenswrapper[4753]: I1005 20:30:23.436805 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhzp2\" (UniqueName: \"kubernetes.io/projected/a5bca284-a880-4ea8-a0fa-74a08ac1f840-kube-api-access-fhzp2\") pod \"a5bca284-a880-4ea8-a0fa-74a08ac1f840\" (UID: \"a5bca284-a880-4ea8-a0fa-74a08ac1f840\") " Oct 05 20:30:23 crc kubenswrapper[4753]: I1005 20:30:23.442425 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5bca284-a880-4ea8-a0fa-74a08ac1f840-kube-api-access-fhzp2" (OuterVolumeSpecName: "kube-api-access-fhzp2") pod "a5bca284-a880-4ea8-a0fa-74a08ac1f840" (UID: "a5bca284-a880-4ea8-a0fa-74a08ac1f840"). InnerVolumeSpecName "kube-api-access-fhzp2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:30:23 crc kubenswrapper[4753]: I1005 20:30:23.538184 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhzp2\" (UniqueName: \"kubernetes.io/projected/a5bca284-a880-4ea8-a0fa-74a08ac1f840-kube-api-access-fhzp2\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:23 crc kubenswrapper[4753]: I1005 20:30:23.960166 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ps278" event={"ID":"a5bca284-a880-4ea8-a0fa-74a08ac1f840","Type":"ContainerDied","Data":"adf3c7e46174079a211484b8bc52bf97c512f8170e4fa260d220ccfd8bbc35c4"} Oct 05 20:30:23 crc kubenswrapper[4753]: I1005 20:30:23.960210 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adf3c7e46174079a211484b8bc52bf97c512f8170e4fa260d220ccfd8bbc35c4" Oct 05 20:30:23 crc kubenswrapper[4753]: I1005 20:30:23.960271 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ps278" Oct 05 20:30:23 crc kubenswrapper[4753]: I1005 20:30:23.964018 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f77d64e-7c7a-4770-9710-8c4aa767bcfa","Type":"ContainerStarted","Data":"527b88c7346a96619d1e411b8d16303db9c661761cf232b2dba5107da80d25a0"} Oct 05 20:30:23 crc kubenswrapper[4753]: I1005 20:30:23.964413 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:30:23 crc kubenswrapper[4753]: I1005 20:30:23.966150 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73d182e9-8e4b-46ce-aa0b-6fd751eefecd","Type":"ContainerStarted","Data":"e838dbccfedc74c736ecacf9ded9f3acd151492c5726744b237bf41a76e69e42"} Oct 05 20:30:23 crc kubenswrapper[4753]: I1005 20:30:23.966354 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" 
Oct 05 20:30:24 crc kubenswrapper[4753]: I1005 20:30:24.023525 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=44.82162458 podStartE2EDuration="55.023507142s" podCreationTimestamp="2025-10-05 20:29:29 +0000 UTC" firstStartedPulling="2025-10-05 20:29:39.605941139 +0000 UTC m=+888.454269371" lastFinishedPulling="2025-10-05 20:29:49.807823701 +0000 UTC m=+898.656151933" observedRunningTime="2025-10-05 20:30:23.995641961 +0000 UTC m=+932.843970193" watchObservedRunningTime="2025-10-05 20:30:24.023507142 +0000 UTC m=+932.871835374"
Oct 05 20:30:24 crc kubenswrapper[4753]: I1005 20:30:24.024720 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=43.772454787 podStartE2EDuration="54.02471556s" podCreationTimestamp="2025-10-05 20:29:30 +0000 UTC" firstStartedPulling="2025-10-05 20:29:39.592002578 +0000 UTC m=+888.440330810" lastFinishedPulling="2025-10-05 20:29:49.844263351 +0000 UTC m=+898.692591583" observedRunningTime="2025-10-05 20:30:24.018854507 +0000 UTC m=+932.867182739" watchObservedRunningTime="2025-10-05 20:30:24.02471556 +0000 UTC m=+932.873043792"
Oct 05 20:30:24 crc kubenswrapper[4753]: I1005 20:30:24.352400 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-7h44h"]
Oct 05 20:30:24 crc kubenswrapper[4753]: E1005 20:30:24.353285 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bca284-a880-4ea8-a0fa-74a08ac1f840" containerName="mariadb-database-create"
Oct 05 20:30:24 crc kubenswrapper[4753]: I1005 20:30:24.353371 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bca284-a880-4ea8-a0fa-74a08ac1f840" containerName="mariadb-database-create"
Oct 05 20:30:24 crc kubenswrapper[4753]: I1005 20:30:24.353582 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5bca284-a880-4ea8-a0fa-74a08ac1f840" containerName="mariadb-database-create"
Oct 05 20:30:24 crc kubenswrapper[4753]: I1005 20:30:24.354101 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7h44h"
Oct 05 20:30:24 crc kubenswrapper[4753]: I1005 20:30:24.375032 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7h44h"]
Oct 05 20:30:24 crc kubenswrapper[4753]: I1005 20:30:24.451492 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltwx4\" (UniqueName: \"kubernetes.io/projected/51a13c89-fe4e-42be-88c4-fe420d90f169-kube-api-access-ltwx4\") pod \"keystone-db-create-7h44h\" (UID: \"51a13c89-fe4e-42be-88c4-fe420d90f169\") " pod="openstack/keystone-db-create-7h44h"
Oct 05 20:30:24 crc kubenswrapper[4753]: I1005 20:30:24.552810 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltwx4\" (UniqueName: \"kubernetes.io/projected/51a13c89-fe4e-42be-88c4-fe420d90f169-kube-api-access-ltwx4\") pod \"keystone-db-create-7h44h\" (UID: \"51a13c89-fe4e-42be-88c4-fe420d90f169\") " pod="openstack/keystone-db-create-7h44h"
Oct 05 20:30:24 crc kubenswrapper[4753]: I1005 20:30:24.589574 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltwx4\" (UniqueName: \"kubernetes.io/projected/51a13c89-fe4e-42be-88c4-fe420d90f169-kube-api-access-ltwx4\") pod \"keystone-db-create-7h44h\" (UID: \"51a13c89-fe4e-42be-88c4-fe420d90f169\") " pod="openstack/keystone-db-create-7h44h"
Oct 05 20:30:24 crc kubenswrapper[4753]: I1005 20:30:24.672641 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7h44h"
Oct 05 20:30:24 crc kubenswrapper[4753]: I1005 20:30:24.808779 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-sd4c8"]
Oct 05 20:30:24 crc kubenswrapper[4753]: I1005 20:30:24.810092 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-sd4c8"
Oct 05 20:30:24 crc kubenswrapper[4753]: I1005 20:30:24.827367 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-sd4c8"]
Oct 05 20:30:24 crc kubenswrapper[4753]: I1005 20:30:24.967546 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6svq\" (UniqueName: \"kubernetes.io/projected/89d998ff-850f-4cc5-a430-4671c9a7b68a-kube-api-access-z6svq\") pod \"placement-db-create-sd4c8\" (UID: \"89d998ff-850f-4cc5-a430-4671c9a7b68a\") " pod="openstack/placement-db-create-sd4c8"
Oct 05 20:30:25 crc kubenswrapper[4753]: I1005 20:30:25.070773 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6svq\" (UniqueName: \"kubernetes.io/projected/89d998ff-850f-4cc5-a430-4671c9a7b68a-kube-api-access-z6svq\") pod \"placement-db-create-sd4c8\" (UID: \"89d998ff-850f-4cc5-a430-4671c9a7b68a\") " pod="openstack/placement-db-create-sd4c8"
Oct 05 20:30:25 crc kubenswrapper[4753]: I1005 20:30:25.099435 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6svq\" (UniqueName: \"kubernetes.io/projected/89d998ff-850f-4cc5-a430-4671c9a7b68a-kube-api-access-z6svq\") pod \"placement-db-create-sd4c8\" (UID: \"89d998ff-850f-4cc5-a430-4671c9a7b68a\") " pod="openstack/placement-db-create-sd4c8"
Oct 05 20:30:25 crc kubenswrapper[4753]: I1005 20:30:25.141031 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-sd4c8"
Oct 05 20:30:25 crc kubenswrapper[4753]: I1005 20:30:25.187152 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7h44h"]
Oct 05 20:30:25 crc kubenswrapper[4753]: W1005 20:30:25.193972 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51a13c89_fe4e_42be_88c4_fe420d90f169.slice/crio-a13ccfb7fe30faad834179d52229e4b3d47911a882de27ed571aafa84337ff8b WatchSource:0}: Error finding container a13ccfb7fe30faad834179d52229e4b3d47911a882de27ed571aafa84337ff8b: Status 404 returned error can't find the container with id a13ccfb7fe30faad834179d52229e4b3d47911a882de27ed571aafa84337ff8b
Oct 05 20:30:25 crc kubenswrapper[4753]: I1005 20:30:25.581550 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-sd4c8"]
Oct 05 20:30:25 crc kubenswrapper[4753]: W1005 20:30:25.581591 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89d998ff_850f_4cc5_a430_4671c9a7b68a.slice/crio-d9364b96524691efa5a56896344a415762c5bebc96919ad55cb2e70677a22e12 WatchSource:0}: Error finding container d9364b96524691efa5a56896344a415762c5bebc96919ad55cb2e70677a22e12: Status 404 returned error can't find the container with id d9364b96524691efa5a56896344a415762c5bebc96919ad55cb2e70677a22e12
Oct 05 20:30:25 crc kubenswrapper[4753]: I1005 20:30:25.988553 4753 generic.go:334] "Generic (PLEG): container finished" podID="51a13c89-fe4e-42be-88c4-fe420d90f169" containerID="f5559f50204feea7cdc73ca6185cb0db5f58e42fb098fc76c6812f6e426a5bd4" exitCode=0
Oct 05 20:30:25 crc kubenswrapper[4753]: I1005 20:30:25.988630 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7h44h" event={"ID":"51a13c89-fe4e-42be-88c4-fe420d90f169","Type":"ContainerDied","Data":"f5559f50204feea7cdc73ca6185cb0db5f58e42fb098fc76c6812f6e426a5bd4"}
Oct 05 20:30:25 crc kubenswrapper[4753]: I1005 20:30:25.988676 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7h44h" event={"ID":"51a13c89-fe4e-42be-88c4-fe420d90f169","Type":"ContainerStarted","Data":"a13ccfb7fe30faad834179d52229e4b3d47911a882de27ed571aafa84337ff8b"}
Oct 05 20:30:25 crc kubenswrapper[4753]: I1005 20:30:25.989721 4753 generic.go:334] "Generic (PLEG): container finished" podID="89d998ff-850f-4cc5-a430-4671c9a7b68a" containerID="a5bc681ae021c2271b974c44eaf1a85451c716bbca34287851922de276ce87a0" exitCode=0
Oct 05 20:30:25 crc kubenswrapper[4753]: I1005 20:30:25.989765 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sd4c8" event={"ID":"89d998ff-850f-4cc5-a430-4671c9a7b68a","Type":"ContainerDied","Data":"a5bc681ae021c2271b974c44eaf1a85451c716bbca34287851922de276ce87a0"}
Oct 05 20:30:25 crc kubenswrapper[4753]: I1005 20:30:25.989792 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sd4c8" event={"ID":"89d998ff-850f-4cc5-a430-4671c9a7b68a","Type":"ContainerStarted","Data":"d9364b96524691efa5a56896344a415762c5bebc96919ad55cb2e70677a22e12"}
Oct 05 20:30:27 crc kubenswrapper[4753]: I1005 20:30:27.410519 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-sd4c8"
Oct 05 20:30:27 crc kubenswrapper[4753]: I1005 20:30:27.415937 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7h44h"
Oct 05 20:30:27 crc kubenswrapper[4753]: I1005 20:30:27.495206 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Oct 05 20:30:27 crc kubenswrapper[4753]: I1005 20:30:27.518631 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6svq\" (UniqueName: \"kubernetes.io/projected/89d998ff-850f-4cc5-a430-4671c9a7b68a-kube-api-access-z6svq\") pod \"89d998ff-850f-4cc5-a430-4671c9a7b68a\" (UID: \"89d998ff-850f-4cc5-a430-4671c9a7b68a\") "
Oct 05 20:30:27 crc kubenswrapper[4753]: I1005 20:30:27.518754 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltwx4\" (UniqueName: \"kubernetes.io/projected/51a13c89-fe4e-42be-88c4-fe420d90f169-kube-api-access-ltwx4\") pod \"51a13c89-fe4e-42be-88c4-fe420d90f169\" (UID: \"51a13c89-fe4e-42be-88c4-fe420d90f169\") "
Oct 05 20:30:27 crc kubenswrapper[4753]: I1005 20:30:27.536506 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a13c89-fe4e-42be-88c4-fe420d90f169-kube-api-access-ltwx4" (OuterVolumeSpecName: "kube-api-access-ltwx4") pod "51a13c89-fe4e-42be-88c4-fe420d90f169" (UID: "51a13c89-fe4e-42be-88c4-fe420d90f169"). InnerVolumeSpecName "kube-api-access-ltwx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:30:27 crc kubenswrapper[4753]: I1005 20:30:27.536591 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89d998ff-850f-4cc5-a430-4671c9a7b68a-kube-api-access-z6svq" (OuterVolumeSpecName: "kube-api-access-z6svq") pod "89d998ff-850f-4cc5-a430-4671c9a7b68a" (UID: "89d998ff-850f-4cc5-a430-4671c9a7b68a"). InnerVolumeSpecName "kube-api-access-z6svq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:30:27 crc kubenswrapper[4753]: I1005 20:30:27.620477 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6svq\" (UniqueName: \"kubernetes.io/projected/89d998ff-850f-4cc5-a430-4671c9a7b68a-kube-api-access-z6svq\") on node \"crc\" DevicePath \"\""
Oct 05 20:30:27 crc kubenswrapper[4753]: I1005 20:30:27.620510 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltwx4\" (UniqueName: \"kubernetes.io/projected/51a13c89-fe4e-42be-88c4-fe420d90f169-kube-api-access-ltwx4\") on node \"crc\" DevicePath \"\""
Oct 05 20:30:28 crc kubenswrapper[4753]: I1005 20:30:28.047106 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sd4c8" event={"ID":"89d998ff-850f-4cc5-a430-4671c9a7b68a","Type":"ContainerDied","Data":"d9364b96524691efa5a56896344a415762c5bebc96919ad55cb2e70677a22e12"}
Oct 05 20:30:28 crc kubenswrapper[4753]: I1005 20:30:28.047158 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9364b96524691efa5a56896344a415762c5bebc96919ad55cb2e70677a22e12"
Oct 05 20:30:28 crc kubenswrapper[4753]: I1005 20:30:28.047210 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-sd4c8"
Oct 05 20:30:28 crc kubenswrapper[4753]: I1005 20:30:28.050193 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7h44h" event={"ID":"51a13c89-fe4e-42be-88c4-fe420d90f169","Type":"ContainerDied","Data":"a13ccfb7fe30faad834179d52229e4b3d47911a882de27ed571aafa84337ff8b"}
Oct 05 20:30:28 crc kubenswrapper[4753]: I1005 20:30:28.050216 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a13ccfb7fe30faad834179d52229e4b3d47911a882de27ed571aafa84337ff8b"
Oct 05 20:30:28 crc kubenswrapper[4753]: I1005 20:30:28.050244 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7h44h"
Oct 05 20:30:30 crc kubenswrapper[4753]: I1005 20:30:30.174210 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b795-account-create-6hggf"]
Oct 05 20:30:30 crc kubenswrapper[4753]: E1005 20:30:30.175527 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a13c89-fe4e-42be-88c4-fe420d90f169" containerName="mariadb-database-create"
Oct 05 20:30:30 crc kubenswrapper[4753]: I1005 20:30:30.175628 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a13c89-fe4e-42be-88c4-fe420d90f169" containerName="mariadb-database-create"
Oct 05 20:30:30 crc kubenswrapper[4753]: E1005 20:30:30.175746 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89d998ff-850f-4cc5-a430-4671c9a7b68a" containerName="mariadb-database-create"
Oct 05 20:30:30 crc kubenswrapper[4753]: I1005 20:30:30.175825 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="89d998ff-850f-4cc5-a430-4671c9a7b68a" containerName="mariadb-database-create"
Oct 05 20:30:30 crc kubenswrapper[4753]: I1005 20:30:30.176107 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="89d998ff-850f-4cc5-a430-4671c9a7b68a" containerName="mariadb-database-create"
Oct 05 20:30:30 crc kubenswrapper[4753]: I1005 20:30:30.176241 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a13c89-fe4e-42be-88c4-fe420d90f169" containerName="mariadb-database-create"
Oct 05 20:30:30 crc kubenswrapper[4753]: I1005 20:30:30.176935 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b795-account-create-6hggf"
Oct 05 20:30:30 crc kubenswrapper[4753]: I1005 20:30:30.180034 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Oct 05 20:30:30 crc kubenswrapper[4753]: I1005 20:30:30.190570 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-778fq\" (UniqueName: \"kubernetes.io/projected/7f079c8d-3df8-41e9-a6f6-fd47d276928c-kube-api-access-778fq\") pod \"glance-b795-account-create-6hggf\" (UID: \"7f079c8d-3df8-41e9-a6f6-fd47d276928c\") " pod="openstack/glance-b795-account-create-6hggf"
Oct 05 20:30:30 crc kubenswrapper[4753]: I1005 20:30:30.193848 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b795-account-create-6hggf"]
Oct 05 20:30:30 crc kubenswrapper[4753]: I1005 20:30:30.292473 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-778fq\" (UniqueName: \"kubernetes.io/projected/7f079c8d-3df8-41e9-a6f6-fd47d276928c-kube-api-access-778fq\") pod \"glance-b795-account-create-6hggf\" (UID: \"7f079c8d-3df8-41e9-a6f6-fd47d276928c\") " pod="openstack/glance-b795-account-create-6hggf"
Oct 05 20:30:30 crc kubenswrapper[4753]: I1005 20:30:30.313945 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-778fq\" (UniqueName: \"kubernetes.io/projected/7f079c8d-3df8-41e9-a6f6-fd47d276928c-kube-api-access-778fq\") pod \"glance-b795-account-create-6hggf\" (UID: \"7f079c8d-3df8-41e9-a6f6-fd47d276928c\") " pod="openstack/glance-b795-account-create-6hggf"
Oct 05 20:30:30 crc kubenswrapper[4753]: I1005 20:30:30.504563 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b795-account-create-6hggf"
Oct 05 20:30:30 crc kubenswrapper[4753]: I1005 20:30:30.946246 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b795-account-create-6hggf"]
Oct 05 20:30:30 crc kubenswrapper[4753]: W1005 20:30:30.954542 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f079c8d_3df8_41e9_a6f6_fd47d276928c.slice/crio-9f6c65612275df8fbe766dc8c75d5d49473b8c676c9d17b2632af53d28a47c87 WatchSource:0}: Error finding container 9f6c65612275df8fbe766dc8c75d5d49473b8c676c9d17b2632af53d28a47c87: Status 404 returned error can't find the container with id 9f6c65612275df8fbe766dc8c75d5d49473b8c676c9d17b2632af53d28a47c87
Oct 05 20:30:31 crc kubenswrapper[4753]: I1005 20:30:31.087606 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b795-account-create-6hggf" event={"ID":"7f079c8d-3df8-41e9-a6f6-fd47d276928c","Type":"ContainerStarted","Data":"9f6c65612275df8fbe766dc8c75d5d49473b8c676c9d17b2632af53d28a47c87"}
Oct 05 20:30:31 crc kubenswrapper[4753]: I1005 20:30:31.780692 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-7zxq7" podUID="61f845cb-9404-421b-b20f-9dee4edd00f8" containerName="ovn-controller" probeResult="failure" output=<
Oct 05 20:30:31 crc kubenswrapper[4753]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Oct 05 20:30:31 crc kubenswrapper[4753]: >
Oct 05 20:30:32 crc kubenswrapper[4753]: I1005 20:30:32.097186 4753 generic.go:334] "Generic (PLEG): container finished" podID="7f079c8d-3df8-41e9-a6f6-fd47d276928c" containerID="704dde8ea9c6a0cc23bc899d20b63c43f6171e917e1570b4bf6e3dd041497fa8" exitCode=0
Oct 05 20:30:32 crc kubenswrapper[4753]: I1005 20:30:32.097244 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b795-account-create-6hggf" event={"ID":"7f079c8d-3df8-41e9-a6f6-fd47d276928c","Type":"ContainerDied","Data":"704dde8ea9c6a0cc23bc899d20b63c43f6171e917e1570b4bf6e3dd041497fa8"}
Oct 05 20:30:33 crc kubenswrapper[4753]: I1005 20:30:33.488731 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b795-account-create-6hggf"
Oct 05 20:30:33 crc kubenswrapper[4753]: I1005 20:30:33.543290 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-778fq\" (UniqueName: \"kubernetes.io/projected/7f079c8d-3df8-41e9-a6f6-fd47d276928c-kube-api-access-778fq\") pod \"7f079c8d-3df8-41e9-a6f6-fd47d276928c\" (UID: \"7f079c8d-3df8-41e9-a6f6-fd47d276928c\") "
Oct 05 20:30:33 crc kubenswrapper[4753]: I1005 20:30:33.551675 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f079c8d-3df8-41e9-a6f6-fd47d276928c-kube-api-access-778fq" (OuterVolumeSpecName: "kube-api-access-778fq") pod "7f079c8d-3df8-41e9-a6f6-fd47d276928c" (UID: "7f079c8d-3df8-41e9-a6f6-fd47d276928c"). InnerVolumeSpecName "kube-api-access-778fq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:30:33 crc kubenswrapper[4753]: I1005 20:30:33.645709 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-778fq\" (UniqueName: \"kubernetes.io/projected/7f079c8d-3df8-41e9-a6f6-fd47d276928c-kube-api-access-778fq\") on node \"crc\" DevicePath \"\""
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.133110 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b795-account-create-6hggf" event={"ID":"7f079c8d-3df8-41e9-a6f6-fd47d276928c","Type":"ContainerDied","Data":"9f6c65612275df8fbe766dc8c75d5d49473b8c676c9d17b2632af53d28a47c87"}
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.133578 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f6c65612275df8fbe766dc8c75d5d49473b8c676c9d17b2632af53d28a47c87"
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.133217 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b795-account-create-6hggf"
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.489108 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.489704 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.499108 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b3c3-account-create-9gh57"]
Oct 05 20:30:34 crc kubenswrapper[4753]: E1005 20:30:34.499430 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f079c8d-3df8-41e9-a6f6-fd47d276928c" containerName="mariadb-account-create"
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.499445 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f079c8d-3df8-41e9-a6f6-fd47d276928c" containerName="mariadb-account-create"
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.499605 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f079c8d-3df8-41e9-a6f6-fd47d276928c" containerName="mariadb-account-create"
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.500112 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b3c3-account-create-9gh57"
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.506466 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.516357 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b3c3-account-create-9gh57"]
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.561685 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9hqw\" (UniqueName: \"kubernetes.io/projected/fd9a587b-445e-4063-bc04-479ea77c77bc-kube-api-access-f9hqw\") pod \"keystone-b3c3-account-create-9gh57\" (UID: \"fd9a587b-445e-4063-bc04-479ea77c77bc\") " pod="openstack/keystone-b3c3-account-create-9gh57"
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.663717 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9hqw\" (UniqueName: \"kubernetes.io/projected/fd9a587b-445e-4063-bc04-479ea77c77bc-kube-api-access-f9hqw\") pod \"keystone-b3c3-account-create-9gh57\" (UID: \"fd9a587b-445e-4063-bc04-479ea77c77bc\") " pod="openstack/keystone-b3c3-account-create-9gh57"
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.690816 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9hqw\" (UniqueName: \"kubernetes.io/projected/fd9a587b-445e-4063-bc04-479ea77c77bc-kube-api-access-f9hqw\") pod \"keystone-b3c3-account-create-9gh57\" (UID: \"fd9a587b-445e-4063-bc04-479ea77c77bc\") " pod="openstack/keystone-b3c3-account-create-9gh57"
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.817354 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b3c3-account-create-9gh57"
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.922632 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2847-account-create-l4pv6"]
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.923786 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2847-account-create-l4pv6"
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.928763 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Oct 05 20:30:34 crc kubenswrapper[4753]: I1005 20:30:34.935409 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2847-account-create-l4pv6"]
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.069344 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26t88\" (UniqueName: \"kubernetes.io/projected/97041ad3-4fe1-475b-8b49-54c1ccab26d8-kube-api-access-26t88\") pod \"placement-2847-account-create-l4pv6\" (UID: \"97041ad3-4fe1-475b-8b49-54c1ccab26d8\") " pod="openstack/placement-2847-account-create-l4pv6"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.172735 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26t88\" (UniqueName: \"kubernetes.io/projected/97041ad3-4fe1-475b-8b49-54c1ccab26d8-kube-api-access-26t88\") pod \"placement-2847-account-create-l4pv6\" (UID: \"97041ad3-4fe1-475b-8b49-54c1ccab26d8\") " pod="openstack/placement-2847-account-create-l4pv6"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.192188 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26t88\" (UniqueName: \"kubernetes.io/projected/97041ad3-4fe1-475b-8b49-54c1ccab26d8-kube-api-access-26t88\") pod \"placement-2847-account-create-l4pv6\" (UID: \"97041ad3-4fe1-475b-8b49-54c1ccab26d8\") " pod="openstack/placement-2847-account-create-l4pv6"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.251585 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2847-account-create-l4pv6"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.268102 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b3c3-account-create-9gh57"]
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.324648 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-ncbw7"]
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.328354 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ncbw7"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.330701 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.343174 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9sx8b"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.343751 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ncbw7"]
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.391896 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-combined-ca-bundle\") pod \"glance-db-sync-ncbw7\" (UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " pod="openstack/glance-db-sync-ncbw7"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.391962 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-config-data\") pod \"glance-db-sync-ncbw7\" (UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " pod="openstack/glance-db-sync-ncbw7"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.392237 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-db-sync-config-data\") pod \"glance-db-sync-ncbw7\" (UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " pod="openstack/glance-db-sync-ncbw7"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.392378 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q26ht\" (UniqueName: \"kubernetes.io/projected/0521eb28-0c37-447f-a80c-60e5187098a5-kube-api-access-q26ht\") pod \"glance-db-sync-ncbw7\" (UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " pod="openstack/glance-db-sync-ncbw7"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.495045 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-db-sync-config-data\") pod \"glance-db-sync-ncbw7\" (UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " pod="openstack/glance-db-sync-ncbw7"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.495128 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q26ht\" (UniqueName: \"kubernetes.io/projected/0521eb28-0c37-447f-a80c-60e5187098a5-kube-api-access-q26ht\") pod \"glance-db-sync-ncbw7\" (UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " pod="openstack/glance-db-sync-ncbw7"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.495163 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-combined-ca-bundle\") pod \"glance-db-sync-ncbw7\" (UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " pod="openstack/glance-db-sync-ncbw7"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.495184 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-config-data\") pod \"glance-db-sync-ncbw7\" (UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " pod="openstack/glance-db-sync-ncbw7"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.501210 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-config-data\") pod \"glance-db-sync-ncbw7\" (UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " pod="openstack/glance-db-sync-ncbw7"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.501536 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-db-sync-config-data\") pod \"glance-db-sync-ncbw7\" (UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " pod="openstack/glance-db-sync-ncbw7"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.504388 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-combined-ca-bundle\") pod \"glance-db-sync-ncbw7\" (UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " pod="openstack/glance-db-sync-ncbw7"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.513824 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q26ht\" (UniqueName: \"kubernetes.io/projected/0521eb28-0c37-447f-a80c-60e5187098a5-kube-api-access-q26ht\") pod \"glance-db-sync-ncbw7\" (UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " pod="openstack/glance-db-sync-ncbw7"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.649738 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ncbw7"
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.755022 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2847-account-create-l4pv6"]
Oct 05 20:30:35 crc kubenswrapper[4753]: I1005 20:30:35.968211 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ncbw7"]
Oct 05 20:30:35 crc kubenswrapper[4753]: W1005 20:30:35.972427 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0521eb28_0c37_447f_a80c_60e5187098a5.slice/crio-533c1417504310f7461a1200ef52b79cddcc653f2d5df08b60b132bd9b649caf WatchSource:0}: Error finding container 533c1417504310f7461a1200ef52b79cddcc653f2d5df08b60b132bd9b649caf: Status 404 returned error can't find the container with id 533c1417504310f7461a1200ef52b79cddcc653f2d5df08b60b132bd9b649caf
Oct 05 20:30:36 crc kubenswrapper[4753]: I1005 20:30:36.154877 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ncbw7" event={"ID":"0521eb28-0c37-447f-a80c-60e5187098a5","Type":"ContainerStarted","Data":"533c1417504310f7461a1200ef52b79cddcc653f2d5df08b60b132bd9b649caf"}
Oct 05 20:30:36 crc kubenswrapper[4753]: I1005 20:30:36.159691 4753 generic.go:334] "Generic (PLEG): container finished" podID="97041ad3-4fe1-475b-8b49-54c1ccab26d8" containerID="4e289b5008df17d6e9d98804a7f7af075fa15472ce1e33968ea7c587cf5f51ef" exitCode=0
Oct 05 20:30:36 crc kubenswrapper[4753]: I1005 20:30:36.159777 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2847-account-create-l4pv6" event={"ID":"97041ad3-4fe1-475b-8b49-54c1ccab26d8","Type":"ContainerDied","Data":"4e289b5008df17d6e9d98804a7f7af075fa15472ce1e33968ea7c587cf5f51ef"}
Oct 05 20:30:36 crc kubenswrapper[4753]: I1005 20:30:36.159806 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2847-account-create-l4pv6" event={"ID":"97041ad3-4fe1-475b-8b49-54c1ccab26d8","Type":"ContainerStarted","Data":"1ac8f10c9b92620b50bb0fb12803bff812c6c95b58a5e6a2d07227eb510f179f"}
Oct 05 20:30:36 crc kubenswrapper[4753]: I1005 20:30:36.166172 4753 generic.go:334] "Generic (PLEG): container finished" podID="fd9a587b-445e-4063-bc04-479ea77c77bc" containerID="98c5303f9c6aff00c19f4b5d3fe957abbf04f336cd37531b2f48be5b18d81df3" exitCode=0
Oct 05 20:30:36 crc kubenswrapper[4753]: I1005 20:30:36.166219 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b3c3-account-create-9gh57" event={"ID":"fd9a587b-445e-4063-bc04-479ea77c77bc","Type":"ContainerDied","Data":"98c5303f9c6aff00c19f4b5d3fe957abbf04f336cd37531b2f48be5b18d81df3"}
Oct 05 20:30:36 crc kubenswrapper[4753]: I1005 20:30:36.166245 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b3c3-account-create-9gh57" event={"ID":"fd9a587b-445e-4063-bc04-479ea77c77bc","Type":"ContainerStarted","Data":"d23679afc20611f9e82424e2f63fee5bc9be798b3e592878d82ed681d177bdab"}
Oct 05 20:30:36 crc kubenswrapper[4753]: I1005 20:30:36.765941 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-7zxq7" podUID="61f845cb-9404-421b-b20f-9dee4edd00f8" containerName="ovn-controller" probeResult="failure" output=<
Oct 05 20:30:36 crc kubenswrapper[4753]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Oct 05 20:30:36 crc kubenswrapper[4753]: >
Oct 05 20:30:36 crc kubenswrapper[4753]: I1005 20:30:36.841955 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8krg4"
Oct 05 20:30:36 crc kubenswrapper[4753]: I1005 20:30:36.844536 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8krg4"
Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.077957 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7zxq7-config-5pw7s"]
Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.079280 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7zxq7-config-5pw7s"
Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.081350 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.088212 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7zxq7-config-5pw7s"]
Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.128995 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7f8d574a-7724-4282-9187-f02e32abc588-additional-scripts\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s"
Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.129067 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-run\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s"
Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.129165 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nqts\" (UniqueName: \"kubernetes.io/projected/7f8d574a-7724-4282-9187-f02e32abc588-kube-api-access-8nqts\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s"
Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.129199 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-run-ovn\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.129222 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f8d574a-7724-4282-9187-f02e32abc588-scripts\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.129423 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-log-ovn\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.230544 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-run\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.230599 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nqts\" (UniqueName: \"kubernetes.io/projected/7f8d574a-7724-4282-9187-f02e32abc588-kube-api-access-8nqts\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.230630 4753 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-run-ovn\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.230649 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f8d574a-7724-4282-9187-f02e32abc588-scripts\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.230700 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-log-ovn\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.230764 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7f8d574a-7724-4282-9187-f02e32abc588-additional-scripts\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.230797 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-run\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.231050 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-run-ovn\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.231540 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7f8d574a-7724-4282-9187-f02e32abc588-additional-scripts\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.231933 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-log-ovn\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.232654 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f8d574a-7724-4282-9187-f02e32abc588-scripts\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.283978 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nqts\" (UniqueName: \"kubernetes.io/projected/7f8d574a-7724-4282-9187-f02e32abc588-kube-api-access-8nqts\") pod \"ovn-controller-7zxq7-config-5pw7s\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.395918 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.572804 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2847-account-create-l4pv6" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.646804 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26t88\" (UniqueName: \"kubernetes.io/projected/97041ad3-4fe1-475b-8b49-54c1ccab26d8-kube-api-access-26t88\") pod \"97041ad3-4fe1-475b-8b49-54c1ccab26d8\" (UID: \"97041ad3-4fe1-475b-8b49-54c1ccab26d8\") " Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.684457 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97041ad3-4fe1-475b-8b49-54c1ccab26d8-kube-api-access-26t88" (OuterVolumeSpecName: "kube-api-access-26t88") pod "97041ad3-4fe1-475b-8b49-54c1ccab26d8" (UID: "97041ad3-4fe1-475b-8b49-54c1ccab26d8"). InnerVolumeSpecName "kube-api-access-26t88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.748761 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26t88\" (UniqueName: \"kubernetes.io/projected/97041ad3-4fe1-475b-8b49-54c1ccab26d8-kube-api-access-26t88\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.769732 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b3c3-account-create-9gh57" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.851536 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9hqw\" (UniqueName: \"kubernetes.io/projected/fd9a587b-445e-4063-bc04-479ea77c77bc-kube-api-access-f9hqw\") pod \"fd9a587b-445e-4063-bc04-479ea77c77bc\" (UID: \"fd9a587b-445e-4063-bc04-479ea77c77bc\") " Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.854597 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd9a587b-445e-4063-bc04-479ea77c77bc-kube-api-access-f9hqw" (OuterVolumeSpecName: "kube-api-access-f9hqw") pod "fd9a587b-445e-4063-bc04-479ea77c77bc" (UID: "fd9a587b-445e-4063-bc04-479ea77c77bc"). InnerVolumeSpecName "kube-api-access-f9hqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:30:37 crc kubenswrapper[4753]: I1005 20:30:37.955190 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9hqw\" (UniqueName: \"kubernetes.io/projected/fd9a587b-445e-4063-bc04-479ea77c77bc-kube-api-access-f9hqw\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:38 crc kubenswrapper[4753]: I1005 20:30:38.037211 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7zxq7-config-5pw7s"] Oct 05 20:30:38 crc kubenswrapper[4753]: W1005 20:30:38.043042 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f8d574a_7724_4282_9187_f02e32abc588.slice/crio-3e76b6cc80519c172c84a893cb5e243743c2cddf2d4b8a9e5c0e40535befbc04 WatchSource:0}: Error finding container 3e76b6cc80519c172c84a893cb5e243743c2cddf2d4b8a9e5c0e40535befbc04: Status 404 returned error can't find the container with id 3e76b6cc80519c172c84a893cb5e243743c2cddf2d4b8a9e5c0e40535befbc04 Oct 05 20:30:38 crc kubenswrapper[4753]: I1005 20:30:38.185635 4753 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-2847-account-create-l4pv6" event={"ID":"97041ad3-4fe1-475b-8b49-54c1ccab26d8","Type":"ContainerDied","Data":"1ac8f10c9b92620b50bb0fb12803bff812c6c95b58a5e6a2d07227eb510f179f"} Oct 05 20:30:38 crc kubenswrapper[4753]: I1005 20:30:38.185679 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ac8f10c9b92620b50bb0fb12803bff812c6c95b58a5e6a2d07227eb510f179f" Oct 05 20:30:38 crc kubenswrapper[4753]: I1005 20:30:38.185636 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2847-account-create-l4pv6" Oct 05 20:30:38 crc kubenswrapper[4753]: I1005 20:30:38.187079 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7zxq7-config-5pw7s" event={"ID":"7f8d574a-7724-4282-9187-f02e32abc588","Type":"ContainerStarted","Data":"3e76b6cc80519c172c84a893cb5e243743c2cddf2d4b8a9e5c0e40535befbc04"} Oct 05 20:30:38 crc kubenswrapper[4753]: I1005 20:30:38.189125 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b3c3-account-create-9gh57" event={"ID":"fd9a587b-445e-4063-bc04-479ea77c77bc","Type":"ContainerDied","Data":"d23679afc20611f9e82424e2f63fee5bc9be798b3e592878d82ed681d177bdab"} Oct 05 20:30:38 crc kubenswrapper[4753]: I1005 20:30:38.189174 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d23679afc20611f9e82424e2f63fee5bc9be798b3e592878d82ed681d177bdab" Oct 05 20:30:38 crc kubenswrapper[4753]: I1005 20:30:38.189175 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b3c3-account-create-9gh57" Oct 05 20:30:39 crc kubenswrapper[4753]: I1005 20:30:39.226437 4753 generic.go:334] "Generic (PLEG): container finished" podID="7f8d574a-7724-4282-9187-f02e32abc588" containerID="b6cb35863e1bfd69e973067627d866681726e82ae091c618cd255b4a866380e2" exitCode=0 Oct 05 20:30:39 crc kubenswrapper[4753]: I1005 20:30:39.226871 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7zxq7-config-5pw7s" event={"ID":"7f8d574a-7724-4282-9187-f02e32abc588","Type":"ContainerDied","Data":"b6cb35863e1bfd69e973067627d866681726e82ae091c618cd255b4a866380e2"} Oct 05 20:30:39 crc kubenswrapper[4753]: I1005 20:30:39.964203 4753 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod6450cdbe-01f9-4878-89cf-28c2a1d53fa0"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod6450cdbe-01f9-4878-89cf-28c2a1d53fa0] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6450cdbe_01f9_4878_89cf_28c2a1d53fa0.slice" Oct 05 20:30:39 crc kubenswrapper[4753]: E1005 20:30:39.964249 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod6450cdbe-01f9-4878-89cf-28c2a1d53fa0] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod6450cdbe-01f9-4878-89cf-28c2a1d53fa0] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6450cdbe_01f9_4878_89cf_28c2a1d53fa0.slice" pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" podUID="6450cdbe-01f9-4878-89cf-28c2a1d53fa0" Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.235639 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b8749979c-dkzzq" Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.289253 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b8749979c-dkzzq"] Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.297765 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b8749979c-dkzzq"] Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.553035 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.696393 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-run-ovn\") pod \"7f8d574a-7724-4282-9187-f02e32abc588\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.696490 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7f8d574a-7724-4282-9187-f02e32abc588" (UID: "7f8d574a-7724-4282-9187-f02e32abc588"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.696617 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f8d574a-7724-4282-9187-f02e32abc588-scripts\") pod \"7f8d574a-7724-4282-9187-f02e32abc588\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.696702 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nqts\" (UniqueName: \"kubernetes.io/projected/7f8d574a-7724-4282-9187-f02e32abc588-kube-api-access-8nqts\") pod \"7f8d574a-7724-4282-9187-f02e32abc588\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.696790 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7f8d574a-7724-4282-9187-f02e32abc588-additional-scripts\") pod \"7f8d574a-7724-4282-9187-f02e32abc588\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.696815 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-log-ovn\") pod \"7f8d574a-7724-4282-9187-f02e32abc588\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.696937 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-run\") pod \"7f8d574a-7724-4282-9187-f02e32abc588\" (UID: \"7f8d574a-7724-4282-9187-f02e32abc588\") " Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.697204 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7f8d574a-7724-4282-9187-f02e32abc588" (UID: "7f8d574a-7724-4282-9187-f02e32abc588"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.697463 4753 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.697488 4753 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.697452 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-run" (OuterVolumeSpecName: "var-run") pod "7f8d574a-7724-4282-9187-f02e32abc588" (UID: "7f8d574a-7724-4282-9187-f02e32abc588"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.697865 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f8d574a-7724-4282-9187-f02e32abc588-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7f8d574a-7724-4282-9187-f02e32abc588" (UID: "7f8d574a-7724-4282-9187-f02e32abc588"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.697931 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f8d574a-7724-4282-9187-f02e32abc588-scripts" (OuterVolumeSpecName: "scripts") pod "7f8d574a-7724-4282-9187-f02e32abc588" (UID: "7f8d574a-7724-4282-9187-f02e32abc588"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.706400 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f8d574a-7724-4282-9187-f02e32abc588-kube-api-access-8nqts" (OuterVolumeSpecName: "kube-api-access-8nqts") pod "7f8d574a-7724-4282-9187-f02e32abc588" (UID: "7f8d574a-7724-4282-9187-f02e32abc588"). InnerVolumeSpecName "kube-api-access-8nqts". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.799786 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f8d574a-7724-4282-9187-f02e32abc588-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.799825 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nqts\" (UniqueName: \"kubernetes.io/projected/7f8d574a-7724-4282-9187-f02e32abc588-kube-api-access-8nqts\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.799843 4753 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7f8d574a-7724-4282-9187-f02e32abc588-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:40 crc kubenswrapper[4753]: I1005 20:30:40.799856 4753 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f8d574a-7724-4282-9187-f02e32abc588-var-run\") on node \"crc\" DevicePath 
\"\"" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.243893 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7zxq7-config-5pw7s" event={"ID":"7f8d574a-7724-4282-9187-f02e32abc588","Type":"ContainerDied","Data":"3e76b6cc80519c172c84a893cb5e243743c2cddf2d4b8a9e5c0e40535befbc04"} Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.244381 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e76b6cc80519c172c84a893cb5e243743c2cddf2d4b8a9e5c0e40535befbc04" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.243950 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7zxq7-config-5pw7s" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.634283 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.653032 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-7zxq7-config-5pw7s"] Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.671471 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-7zxq7-config-5pw7s"] Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.759351 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7zxq7-config-6pk7r"] Oct 05 20:30:41 crc kubenswrapper[4753]: E1005 20:30:41.759713 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97041ad3-4fe1-475b-8b49-54c1ccab26d8" containerName="mariadb-account-create" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.759725 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="97041ad3-4fe1-475b-8b49-54c1ccab26d8" containerName="mariadb-account-create" Oct 05 20:30:41 crc kubenswrapper[4753]: E1005 20:30:41.759750 4753 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7f8d574a-7724-4282-9187-f02e32abc588" containerName="ovn-config" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.759757 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8d574a-7724-4282-9187-f02e32abc588" containerName="ovn-config" Oct 05 20:30:41 crc kubenswrapper[4753]: E1005 20:30:41.759774 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9a587b-445e-4063-bc04-479ea77c77bc" containerName="mariadb-account-create" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.759780 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9a587b-445e-4063-bc04-479ea77c77bc" containerName="mariadb-account-create" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.759944 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="97041ad3-4fe1-475b-8b49-54c1ccab26d8" containerName="mariadb-account-create" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.759960 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9a587b-445e-4063-bc04-479ea77c77bc" containerName="mariadb-account-create" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.759967 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f8d574a-7724-4282-9187-f02e32abc588" containerName="ovn-config" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.760551 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.763102 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.771496 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7zxq7-config-6pk7r"] Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.797332 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-7zxq7" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.864113 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6450cdbe-01f9-4878-89cf-28c2a1d53fa0" path="/var/lib/kubelet/pods/6450cdbe-01f9-4878-89cf-28c2a1d53fa0/volumes" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.865093 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f8d574a-7724-4282-9187-f02e32abc588" path="/var/lib/kubelet/pods/7f8d574a-7724-4282-9187-f02e32abc588/volumes" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.887596 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.917463 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcwhg\" (UniqueName: \"kubernetes.io/projected/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-kube-api-access-vcwhg\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.917536 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-run\") pod \"ovn-controller-7zxq7-config-6pk7r\" 
(UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.917629 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-scripts\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.917658 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-additional-scripts\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.917753 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-run-ovn\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:41 crc kubenswrapper[4753]: I1005 20:30:41.917778 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-log-ovn\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:42 crc kubenswrapper[4753]: I1005 20:30:42.019093 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-run-ovn\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:42 crc kubenswrapper[4753]: I1005 20:30:42.019156 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-log-ovn\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:42 crc kubenswrapper[4753]: I1005 20:30:42.019192 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcwhg\" (UniqueName: \"kubernetes.io/projected/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-kube-api-access-vcwhg\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:42 crc kubenswrapper[4753]: I1005 20:30:42.019227 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-run\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:42 crc kubenswrapper[4753]: I1005 20:30:42.019325 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-scripts\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:42 crc kubenswrapper[4753]: I1005 20:30:42.019340 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-additional-scripts\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:42 crc kubenswrapper[4753]: I1005 20:30:42.019794 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-run-ovn\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:42 crc kubenswrapper[4753]: I1005 20:30:42.019976 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-log-ovn\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:42 crc kubenswrapper[4753]: I1005 20:30:42.020380 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-run\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:42 crc kubenswrapper[4753]: I1005 20:30:42.021229 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-additional-scripts\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:42 crc kubenswrapper[4753]: I1005 20:30:42.022227 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-scripts\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:42 crc kubenswrapper[4753]: I1005 20:30:42.037301 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcwhg\" (UniqueName: \"kubernetes.io/projected/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-kube-api-access-vcwhg\") pod \"ovn-controller-7zxq7-config-6pk7r\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:42 crc kubenswrapper[4753]: I1005 20:30:42.084211 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:42 crc kubenswrapper[4753]: I1005 20:30:42.609252 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7zxq7-config-6pk7r"] Oct 05 20:30:43 crc kubenswrapper[4753]: I1005 20:30:43.976855 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-8mgfs"] Oct 05 20:30:43 crc kubenswrapper[4753]: I1005 20:30:43.978047 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8mgfs" Oct 05 20:30:43 crc kubenswrapper[4753]: I1005 20:30:43.987855 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8mgfs"] Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.081325 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-jwnks"] Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.082307 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-jwnks" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.094723 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jwnks"] Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.159110 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6sxn\" (UniqueName: \"kubernetes.io/projected/9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6-kube-api-access-x6sxn\") pod \"cinder-db-create-8mgfs\" (UID: \"9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6\") " pod="openstack/cinder-db-create-8mgfs" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.237302 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-mlx4p"] Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.238636 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mlx4p" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.242439 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.242702 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.243047 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.245330 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2mmgx" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.245604 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mlx4p"] Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.261545 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzxnj\" (UniqueName: 
\"kubernetes.io/projected/389992bc-e15e-444e-b7ca-7f7212ad86d0-kube-api-access-wzxnj\") pod \"barbican-db-create-jwnks\" (UID: \"389992bc-e15e-444e-b7ca-7f7212ad86d0\") " pod="openstack/barbican-db-create-jwnks" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.261736 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6sxn\" (UniqueName: \"kubernetes.io/projected/9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6-kube-api-access-x6sxn\") pod \"cinder-db-create-8mgfs\" (UID: \"9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6\") " pod="openstack/cinder-db-create-8mgfs" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.284243 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6sxn\" (UniqueName: \"kubernetes.io/projected/9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6-kube-api-access-x6sxn\") pod \"cinder-db-create-8mgfs\" (UID: \"9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6\") " pod="openstack/cinder-db-create-8mgfs" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.292963 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8mgfs" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.362748 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzxnj\" (UniqueName: \"kubernetes.io/projected/389992bc-e15e-444e-b7ca-7f7212ad86d0-kube-api-access-wzxnj\") pod \"barbican-db-create-jwnks\" (UID: \"389992bc-e15e-444e-b7ca-7f7212ad86d0\") " pod="openstack/barbican-db-create-jwnks" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.363132 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpclh\" (UniqueName: \"kubernetes.io/projected/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-kube-api-access-mpclh\") pod \"keystone-db-sync-mlx4p\" (UID: \"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba\") " pod="openstack/keystone-db-sync-mlx4p" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.363202 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-combined-ca-bundle\") pod \"keystone-db-sync-mlx4p\" (UID: \"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba\") " pod="openstack/keystone-db-sync-mlx4p" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.363223 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-config-data\") pod \"keystone-db-sync-mlx4p\" (UID: \"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba\") " pod="openstack/keystone-db-sync-mlx4p" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.381167 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-zbs5q"] Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.382208 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zbs5q" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.396955 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzxnj\" (UniqueName: \"kubernetes.io/projected/389992bc-e15e-444e-b7ca-7f7212ad86d0-kube-api-access-wzxnj\") pod \"barbican-db-create-jwnks\" (UID: \"389992bc-e15e-444e-b7ca-7f7212ad86d0\") " pod="openstack/barbican-db-create-jwnks" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.403716 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zbs5q"] Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.405982 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jwnks" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.465060 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-combined-ca-bundle\") pod \"keystone-db-sync-mlx4p\" (UID: \"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba\") " pod="openstack/keystone-db-sync-mlx4p" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.465107 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-config-data\") pod \"keystone-db-sync-mlx4p\" (UID: \"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba\") " pod="openstack/keystone-db-sync-mlx4p" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.465221 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpclh\" (UniqueName: \"kubernetes.io/projected/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-kube-api-access-mpclh\") pod \"keystone-db-sync-mlx4p\" (UID: \"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba\") " pod="openstack/keystone-db-sync-mlx4p" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.468011 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-combined-ca-bundle\") pod \"keystone-db-sync-mlx4p\" (UID: \"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba\") " pod="openstack/keystone-db-sync-mlx4p" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.482742 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-config-data\") pod \"keystone-db-sync-mlx4p\" (UID: \"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba\") " pod="openstack/keystone-db-sync-mlx4p" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.492522 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpclh\" (UniqueName: \"kubernetes.io/projected/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-kube-api-access-mpclh\") pod \"keystone-db-sync-mlx4p\" (UID: \"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba\") " pod="openstack/keystone-db-sync-mlx4p" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.553033 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-mlx4p" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.567013 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppt2s\" (UniqueName: \"kubernetes.io/projected/8bc975d7-0761-473f-8fa7-d9f0980ac9bc-kube-api-access-ppt2s\") pod \"neutron-db-create-zbs5q\" (UID: \"8bc975d7-0761-473f-8fa7-d9f0980ac9bc\") " pod="openstack/neutron-db-create-zbs5q" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.668038 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppt2s\" (UniqueName: \"kubernetes.io/projected/8bc975d7-0761-473f-8fa7-d9f0980ac9bc-kube-api-access-ppt2s\") pod \"neutron-db-create-zbs5q\" (UID: \"8bc975d7-0761-473f-8fa7-d9f0980ac9bc\") " pod="openstack/neutron-db-create-zbs5q" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.684938 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppt2s\" (UniqueName: \"kubernetes.io/projected/8bc975d7-0761-473f-8fa7-d9f0980ac9bc-kube-api-access-ppt2s\") pod \"neutron-db-create-zbs5q\" (UID: \"8bc975d7-0761-473f-8fa7-d9f0980ac9bc\") " pod="openstack/neutron-db-create-zbs5q" Oct 05 20:30:44 crc kubenswrapper[4753]: I1005 20:30:44.733842 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zbs5q" Oct 05 20:30:50 crc kubenswrapper[4753]: I1005 20:30:50.320908 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7zxq7-config-6pk7r" event={"ID":"b848d9c1-5fef-472b-9ab9-ea42bc8123c5","Type":"ContainerStarted","Data":"466a8e4047d0ba4aa738e12f34d8a710a2222e7b80d4534853217600e464e2ab"} Oct 05 20:30:50 crc kubenswrapper[4753]: I1005 20:30:50.749344 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zbs5q"] Oct 05 20:30:50 crc kubenswrapper[4753]: W1005 20:30:50.773092 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bc975d7_0761_473f_8fa7_d9f0980ac9bc.slice/crio-f5dde49a8064e3c44f24c340fb292bc686c7cc43291966e4a035d19b528d80a0 WatchSource:0}: Error finding container f5dde49a8064e3c44f24c340fb292bc686c7cc43291966e4a035d19b528d80a0: Status 404 returned error can't find the container with id f5dde49a8064e3c44f24c340fb292bc686c7cc43291966e4a035d19b528d80a0 Oct 05 20:30:50 crc kubenswrapper[4753]: I1005 20:30:50.775681 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mlx4p"] Oct 05 20:30:50 crc kubenswrapper[4753]: W1005 20:30:50.783074 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ab65a8e_6432_45b7_a2ed_c252ebbe60ba.slice/crio-e4f0c95c51ced13537dd8e0675e8574fe5871b6f6a460028c37e3a30a342ee53 WatchSource:0}: Error finding container e4f0c95c51ced13537dd8e0675e8574fe5871b6f6a460028c37e3a30a342ee53: Status 404 returned error can't find the container with id e4f0c95c51ced13537dd8e0675e8574fe5871b6f6a460028c37e3a30a342ee53 Oct 05 20:30:50 crc kubenswrapper[4753]: I1005 20:30:50.872565 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jwnks"] Oct 05 20:30:50 crc kubenswrapper[4753]: W1005 
20:30:50.878346 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod389992bc_e15e_444e_b7ca_7f7212ad86d0.slice/crio-b580c8a854a7d674b8d68527d16759c54d5356373cf32dec9161e7b9d8705a17 WatchSource:0}: Error finding container b580c8a854a7d674b8d68527d16759c54d5356373cf32dec9161e7b9d8705a17: Status 404 returned error can't find the container with id b580c8a854a7d674b8d68527d16759c54d5356373cf32dec9161e7b9d8705a17 Oct 05 20:30:50 crc kubenswrapper[4753]: I1005 20:30:50.934870 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8mgfs"] Oct 05 20:30:50 crc kubenswrapper[4753]: W1005 20:30:50.948182 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e3e2cf1_c36b_4258_a034_4c0a9c5e16e6.slice/crio-b50732e8fce19a41174751c79f0ca106296f605ca30187c820b994a43ffb80c6 WatchSource:0}: Error finding container b50732e8fce19a41174751c79f0ca106296f605ca30187c820b994a43ffb80c6: Status 404 returned error can't find the container with id b50732e8fce19a41174751c79f0ca106296f605ca30187c820b994a43ffb80c6 Oct 05 20:30:51 crc kubenswrapper[4753]: I1005 20:30:51.328967 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mlx4p" event={"ID":"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba","Type":"ContainerStarted","Data":"e4f0c95c51ced13537dd8e0675e8574fe5871b6f6a460028c37e3a30a342ee53"} Oct 05 20:30:51 crc kubenswrapper[4753]: I1005 20:30:51.331220 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jwnks" event={"ID":"389992bc-e15e-444e-b7ca-7f7212ad86d0","Type":"ContainerDied","Data":"f6db53208919ef56c84b301868dcc0be286ca72b29dbd8d81691ad792a187b0e"} Oct 05 20:30:51 crc kubenswrapper[4753]: I1005 20:30:51.331414 4753 generic.go:334] "Generic (PLEG): container finished" podID="389992bc-e15e-444e-b7ca-7f7212ad86d0" 
containerID="f6db53208919ef56c84b301868dcc0be286ca72b29dbd8d81691ad792a187b0e" exitCode=0 Oct 05 20:30:51 crc kubenswrapper[4753]: I1005 20:30:51.331518 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jwnks" event={"ID":"389992bc-e15e-444e-b7ca-7f7212ad86d0","Type":"ContainerStarted","Data":"b580c8a854a7d674b8d68527d16759c54d5356373cf32dec9161e7b9d8705a17"} Oct 05 20:30:51 crc kubenswrapper[4753]: I1005 20:30:51.332971 4753 generic.go:334] "Generic (PLEG): container finished" podID="8bc975d7-0761-473f-8fa7-d9f0980ac9bc" containerID="64d6643fe6b64a1c4fe1ba8be4cc6a9e0d2abb9890e449bddb75e47717e8e554" exitCode=0 Oct 05 20:30:51 crc kubenswrapper[4753]: I1005 20:30:51.333154 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zbs5q" event={"ID":"8bc975d7-0761-473f-8fa7-d9f0980ac9bc","Type":"ContainerDied","Data":"64d6643fe6b64a1c4fe1ba8be4cc6a9e0d2abb9890e449bddb75e47717e8e554"} Oct 05 20:30:51 crc kubenswrapper[4753]: I1005 20:30:51.333184 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zbs5q" event={"ID":"8bc975d7-0761-473f-8fa7-d9f0980ac9bc","Type":"ContainerStarted","Data":"f5dde49a8064e3c44f24c340fb292bc686c7cc43291966e4a035d19b528d80a0"} Oct 05 20:30:51 crc kubenswrapper[4753]: I1005 20:30:51.335953 4753 generic.go:334] "Generic (PLEG): container finished" podID="9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6" containerID="4f900490c738096c748cb64d00deb888c83e0facbd08c0e7fcf8ce224e1b3553" exitCode=0 Oct 05 20:30:51 crc kubenswrapper[4753]: I1005 20:30:51.336038 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8mgfs" event={"ID":"9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6","Type":"ContainerDied","Data":"4f900490c738096c748cb64d00deb888c83e0facbd08c0e7fcf8ce224e1b3553"} Oct 05 20:30:51 crc kubenswrapper[4753]: I1005 20:30:51.336081 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8mgfs" 
event={"ID":"9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6","Type":"ContainerStarted","Data":"b50732e8fce19a41174751c79f0ca106296f605ca30187c820b994a43ffb80c6"} Oct 05 20:30:51 crc kubenswrapper[4753]: I1005 20:30:51.337717 4753 generic.go:334] "Generic (PLEG): container finished" podID="b848d9c1-5fef-472b-9ab9-ea42bc8123c5" containerID="ffa3c0b3f8d357022658924fbde7c96b5af854dff028dded3bb2c558e162c90c" exitCode=0 Oct 05 20:30:51 crc kubenswrapper[4753]: I1005 20:30:51.337772 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7zxq7-config-6pk7r" event={"ID":"b848d9c1-5fef-472b-9ab9-ea42bc8123c5","Type":"ContainerDied","Data":"ffa3c0b3f8d357022658924fbde7c96b5af854dff028dded3bb2c558e162c90c"} Oct 05 20:30:51 crc kubenswrapper[4753]: I1005 20:30:51.339455 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ncbw7" event={"ID":"0521eb28-0c37-447f-a80c-60e5187098a5","Type":"ContainerStarted","Data":"f4fd978be212d0546236c9f5f046acd92e193fb91e687e33a566904b6bf49508"} Oct 05 20:30:51 crc kubenswrapper[4753]: I1005 20:30:51.422014 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-ncbw7" podStartSLOduration=2.146618759 podStartE2EDuration="16.421993514s" podCreationTimestamp="2025-10-05 20:30:35 +0000 UTC" firstStartedPulling="2025-10-05 20:30:35.975685658 +0000 UTC m=+944.824013890" lastFinishedPulling="2025-10-05 20:30:50.251060423 +0000 UTC m=+959.099388645" observedRunningTime="2025-10-05 20:30:51.420943821 +0000 UTC m=+960.269272073" watchObservedRunningTime="2025-10-05 20:30:51.421993514 +0000 UTC m=+960.270321746" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.626623 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jwnks" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.709378 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zbs5q" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.728036 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppt2s\" (UniqueName: \"kubernetes.io/projected/8bc975d7-0761-473f-8fa7-d9f0980ac9bc-kube-api-access-ppt2s\") pod \"8bc975d7-0761-473f-8fa7-d9f0980ac9bc\" (UID: \"8bc975d7-0761-473f-8fa7-d9f0980ac9bc\") " Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.733032 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzxnj\" (UniqueName: \"kubernetes.io/projected/389992bc-e15e-444e-b7ca-7f7212ad86d0-kube-api-access-wzxnj\") pod \"389992bc-e15e-444e-b7ca-7f7212ad86d0\" (UID: \"389992bc-e15e-444e-b7ca-7f7212ad86d0\") " Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.736067 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/389992bc-e15e-444e-b7ca-7f7212ad86d0-kube-api-access-wzxnj" (OuterVolumeSpecName: "kube-api-access-wzxnj") pod "389992bc-e15e-444e-b7ca-7f7212ad86d0" (UID: "389992bc-e15e-444e-b7ca-7f7212ad86d0"). InnerVolumeSpecName "kube-api-access-wzxnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.737558 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc975d7-0761-473f-8fa7-d9f0980ac9bc-kube-api-access-ppt2s" (OuterVolumeSpecName: "kube-api-access-ppt2s") pod "8bc975d7-0761-473f-8fa7-d9f0980ac9bc" (UID: "8bc975d7-0761-473f-8fa7-d9f0980ac9bc"). InnerVolumeSpecName "kube-api-access-ppt2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.777440 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8mgfs" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.802859 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.834325 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-run-ovn\") pod \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.834400 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-log-ovn\") pod \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.834436 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-scripts\") pod \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.834465 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-additional-scripts\") pod \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.834508 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcwhg\" (UniqueName: \"kubernetes.io/projected/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-kube-api-access-vcwhg\") pod \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\" (UID: 
\"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.834561 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6sxn\" (UniqueName: \"kubernetes.io/projected/9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6-kube-api-access-x6sxn\") pod \"9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6\" (UID: \"9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6\") " Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.834601 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-run\") pod \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\" (UID: \"b848d9c1-5fef-472b-9ab9-ea42bc8123c5\") " Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.834832 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzxnj\" (UniqueName: \"kubernetes.io/projected/389992bc-e15e-444e-b7ca-7f7212ad86d0-kube-api-access-wzxnj\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.834852 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppt2s\" (UniqueName: \"kubernetes.io/projected/8bc975d7-0761-473f-8fa7-d9f0980ac9bc-kube-api-access-ppt2s\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.834912 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-run" (OuterVolumeSpecName: "var-run") pod "b848d9c1-5fef-472b-9ab9-ea42bc8123c5" (UID: "b848d9c1-5fef-472b-9ab9-ea42bc8123c5"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.834964 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b848d9c1-5fef-472b-9ab9-ea42bc8123c5" (UID: "b848d9c1-5fef-472b-9ab9-ea42bc8123c5"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.834992 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b848d9c1-5fef-472b-9ab9-ea42bc8123c5" (UID: "b848d9c1-5fef-472b-9ab9-ea42bc8123c5"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.836097 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b848d9c1-5fef-472b-9ab9-ea42bc8123c5" (UID: "b848d9c1-5fef-472b-9ab9-ea42bc8123c5"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.836582 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-scripts" (OuterVolumeSpecName: "scripts") pod "b848d9c1-5fef-472b-9ab9-ea42bc8123c5" (UID: "b848d9c1-5fef-472b-9ab9-ea42bc8123c5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.838965 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-kube-api-access-vcwhg" (OuterVolumeSpecName: "kube-api-access-vcwhg") pod "b848d9c1-5fef-472b-9ab9-ea42bc8123c5" (UID: "b848d9c1-5fef-472b-9ab9-ea42bc8123c5"). InnerVolumeSpecName "kube-api-access-vcwhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.843827 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6-kube-api-access-x6sxn" (OuterVolumeSpecName: "kube-api-access-x6sxn") pod "9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6" (UID: "9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6"). InnerVolumeSpecName "kube-api-access-x6sxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.935889 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6sxn\" (UniqueName: \"kubernetes.io/projected/9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6-kube-api-access-x6sxn\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.935921 4753 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-run\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.935930 4753 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.935940 4753 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.935949 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.935957 4753 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:52 crc kubenswrapper[4753]: I1005 20:30:52.935966 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcwhg\" (UniqueName: \"kubernetes.io/projected/b848d9c1-5fef-472b-9ab9-ea42bc8123c5-kube-api-access-vcwhg\") on node \"crc\" DevicePath \"\"" Oct 05 20:30:53 crc kubenswrapper[4753]: I1005 20:30:53.356794 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8mgfs" event={"ID":"9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6","Type":"ContainerDied","Data":"b50732e8fce19a41174751c79f0ca106296f605ca30187c820b994a43ffb80c6"} Oct 05 20:30:53 crc kubenswrapper[4753]: I1005 20:30:53.356836 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8mgfs" Oct 05 20:30:53 crc kubenswrapper[4753]: I1005 20:30:53.356842 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b50732e8fce19a41174751c79f0ca106296f605ca30187c820b994a43ffb80c6" Oct 05 20:30:53 crc kubenswrapper[4753]: I1005 20:30:53.358849 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7zxq7-config-6pk7r" Oct 05 20:30:53 crc kubenswrapper[4753]: I1005 20:30:53.358838 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7zxq7-config-6pk7r" event={"ID":"b848d9c1-5fef-472b-9ab9-ea42bc8123c5","Type":"ContainerDied","Data":"466a8e4047d0ba4aa738e12f34d8a710a2222e7b80d4534853217600e464e2ab"} Oct 05 20:30:53 crc kubenswrapper[4753]: I1005 20:30:53.358981 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="466a8e4047d0ba4aa738e12f34d8a710a2222e7b80d4534853217600e464e2ab" Oct 05 20:30:53 crc kubenswrapper[4753]: I1005 20:30:53.360783 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jwnks" event={"ID":"389992bc-e15e-444e-b7ca-7f7212ad86d0","Type":"ContainerDied","Data":"b580c8a854a7d674b8d68527d16759c54d5356373cf32dec9161e7b9d8705a17"} Oct 05 20:30:53 crc kubenswrapper[4753]: I1005 20:30:53.360815 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b580c8a854a7d674b8d68527d16759c54d5356373cf32dec9161e7b9d8705a17" Oct 05 20:30:53 crc kubenswrapper[4753]: I1005 20:30:53.360868 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jwnks" Oct 05 20:30:53 crc kubenswrapper[4753]: I1005 20:30:53.370759 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zbs5q" event={"ID":"8bc975d7-0761-473f-8fa7-d9f0980ac9bc","Type":"ContainerDied","Data":"f5dde49a8064e3c44f24c340fb292bc686c7cc43291966e4a035d19b528d80a0"} Oct 05 20:30:53 crc kubenswrapper[4753]: I1005 20:30:53.370802 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5dde49a8064e3c44f24c340fb292bc686c7cc43291966e4a035d19b528d80a0" Oct 05 20:30:53 crc kubenswrapper[4753]: I1005 20:30:53.370905 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zbs5q" Oct 05 20:30:53 crc kubenswrapper[4753]: I1005 20:30:53.891941 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-7zxq7-config-6pk7r"] Oct 05 20:30:53 crc kubenswrapper[4753]: I1005 20:30:53.898752 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-7zxq7-config-6pk7r"] Oct 05 20:30:55 crc kubenswrapper[4753]: I1005 20:30:55.868337 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b848d9c1-5fef-472b-9ab9-ea42bc8123c5" path="/var/lib/kubelet/pods/b848d9c1-5fef-472b-9ab9-ea42bc8123c5/volumes" Oct 05 20:30:56 crc kubenswrapper[4753]: I1005 20:30:56.397108 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mlx4p" event={"ID":"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba","Type":"ContainerStarted","Data":"bf3403508f00872b209bead25e80f081623e3c861ed59fd3cb93ae2177567de6"} Oct 05 20:30:56 crc kubenswrapper[4753]: I1005 20:30:56.441308 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-mlx4p" podStartSLOduration=7.741382705 podStartE2EDuration="12.441281197s" podCreationTimestamp="2025-10-05 20:30:44 +0000 UTC" firstStartedPulling="2025-10-05 20:30:50.794898738 +0000 UTC m=+959.643226970" lastFinishedPulling="2025-10-05 20:30:55.49479724 +0000 UTC m=+964.343125462" observedRunningTime="2025-10-05 20:30:56.414190401 +0000 UTC m=+965.262518633" watchObservedRunningTime="2025-10-05 20:30:56.441281197 +0000 UTC m=+965.289609439" Oct 05 20:30:58 crc kubenswrapper[4753]: I1005 20:30:58.416500 4753 generic.go:334] "Generic (PLEG): container finished" podID="1ab65a8e-6432-45b7-a2ed-c252ebbe60ba" containerID="bf3403508f00872b209bead25e80f081623e3c861ed59fd3cb93ae2177567de6" exitCode=0 Oct 05 20:30:58 crc kubenswrapper[4753]: I1005 20:30:58.416565 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mlx4p" 
event={"ID":"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba","Type":"ContainerDied","Data":"bf3403508f00872b209bead25e80f081623e3c861ed59fd3cb93ae2177567de6"} Oct 05 20:30:59 crc kubenswrapper[4753]: I1005 20:30:59.429285 4753 generic.go:334] "Generic (PLEG): container finished" podID="0521eb28-0c37-447f-a80c-60e5187098a5" containerID="f4fd978be212d0546236c9f5f046acd92e193fb91e687e33a566904b6bf49508" exitCode=0 Oct 05 20:30:59 crc kubenswrapper[4753]: I1005 20:30:59.429398 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ncbw7" event={"ID":"0521eb28-0c37-447f-a80c-60e5187098a5","Type":"ContainerDied","Data":"f4fd978be212d0546236c9f5f046acd92e193fb91e687e33a566904b6bf49508"} Oct 05 20:30:59 crc kubenswrapper[4753]: I1005 20:30:59.800395 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mlx4p" Oct 05 20:30:59 crc kubenswrapper[4753]: I1005 20:30:59.971240 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-config-data\") pod \"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba\" (UID: \"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba\") " Oct 05 20:30:59 crc kubenswrapper[4753]: I1005 20:30:59.971413 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-combined-ca-bundle\") pod \"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba\" (UID: \"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba\") " Oct 05 20:30:59 crc kubenswrapper[4753]: I1005 20:30:59.971792 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpclh\" (UniqueName: \"kubernetes.io/projected/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-kube-api-access-mpclh\") pod \"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba\" (UID: \"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba\") " Oct 05 20:30:59 crc 
kubenswrapper[4753]: I1005 20:30:59.976505 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-kube-api-access-mpclh" (OuterVolumeSpecName: "kube-api-access-mpclh") pod "1ab65a8e-6432-45b7-a2ed-c252ebbe60ba" (UID: "1ab65a8e-6432-45b7-a2ed-c252ebbe60ba"). InnerVolumeSpecName "kube-api-access-mpclh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:30:59 crc kubenswrapper[4753]: I1005 20:30:59.996676 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ab65a8e-6432-45b7-a2ed-c252ebbe60ba" (UID: "1ab65a8e-6432-45b7-a2ed-c252ebbe60ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.012904 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-config-data" (OuterVolumeSpecName: "config-data") pod "1ab65a8e-6432-45b7-a2ed-c252ebbe60ba" (UID: "1ab65a8e-6432-45b7-a2ed-c252ebbe60ba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.073793 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.073829 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpclh\" (UniqueName: \"kubernetes.io/projected/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-kube-api-access-mpclh\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.073843 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.436802 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mlx4p" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.436829 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mlx4p" event={"ID":"1ab65a8e-6432-45b7-a2ed-c252ebbe60ba","Type":"ContainerDied","Data":"e4f0c95c51ced13537dd8e0675e8574fe5871b6f6a460028c37e3a30a342ee53"} Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.437185 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4f0c95c51ced13537dd8e0675e8574fe5871b6f6a460028c37e3a30a342ee53" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.649427 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vt5gp"] Oct 05 20:31:00 crc kubenswrapper[4753]: E1005 20:31:00.649826 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6" containerName="mariadb-database-create" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 
20:31:00.649843 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6" containerName="mariadb-database-create" Oct 05 20:31:00 crc kubenswrapper[4753]: E1005 20:31:00.649860 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab65a8e-6432-45b7-a2ed-c252ebbe60ba" containerName="keystone-db-sync" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.649868 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab65a8e-6432-45b7-a2ed-c252ebbe60ba" containerName="keystone-db-sync" Oct 05 20:31:00 crc kubenswrapper[4753]: E1005 20:31:00.649886 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b848d9c1-5fef-472b-9ab9-ea42bc8123c5" containerName="ovn-config" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.649893 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b848d9c1-5fef-472b-9ab9-ea42bc8123c5" containerName="ovn-config" Oct 05 20:31:00 crc kubenswrapper[4753]: E1005 20:31:00.649924 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389992bc-e15e-444e-b7ca-7f7212ad86d0" containerName="mariadb-database-create" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.649932 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="389992bc-e15e-444e-b7ca-7f7212ad86d0" containerName="mariadb-database-create" Oct 05 20:31:00 crc kubenswrapper[4753]: E1005 20:31:00.649945 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc975d7-0761-473f-8fa7-d9f0980ac9bc" containerName="mariadb-database-create" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.649951 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc975d7-0761-473f-8fa7-d9f0980ac9bc" containerName="mariadb-database-create" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.650153 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b848d9c1-5fef-472b-9ab9-ea42bc8123c5" containerName="ovn-config" Oct 05 20:31:00 crc 
kubenswrapper[4753]: I1005 20:31:00.650171 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6" containerName="mariadb-database-create" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.650183 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="389992bc-e15e-444e-b7ca-7f7212ad86d0" containerName="mariadb-database-create" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.650199 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ab65a8e-6432-45b7-a2ed-c252ebbe60ba" containerName="keystone-db-sync" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.650209 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bc975d7-0761-473f-8fa7-d9f0980ac9bc" containerName="mariadb-database-create" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.651033 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.656124 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2mmgx" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.656516 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.656726 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.661062 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.667539 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dc86c69bc-2m5k9"] Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.668940 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.676605 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc86c69bc-2m5k9"] Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.708012 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vt5gp"] Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.786858 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lx2v\" (UniqueName: \"kubernetes.io/projected/7db01ca6-f0c9-4f53-888a-dc2c806310cd-kube-api-access-4lx2v\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.786911 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cnfd\" (UniqueName: \"kubernetes.io/projected/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-kube-api-access-5cnfd\") pod \"dnsmasq-dns-6dc86c69bc-2m5k9\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.786931 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-dns-svc\") pod \"dnsmasq-dns-6dc86c69bc-2m5k9\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.786957 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-config\") pod \"dnsmasq-dns-6dc86c69bc-2m5k9\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " 
pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.786978 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc86c69bc-2m5k9\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.787045 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-config-data\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.787073 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc86c69bc-2m5k9\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.787174 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-fernet-keys\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.787365 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-credential-keys\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") 
" pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.787402 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-combined-ca-bundle\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.794227 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-scripts\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.896051 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-scripts\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.896117 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lx2v\" (UniqueName: \"kubernetes.io/projected/7db01ca6-f0c9-4f53-888a-dc2c806310cd-kube-api-access-4lx2v\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.896156 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cnfd\" (UniqueName: \"kubernetes.io/projected/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-kube-api-access-5cnfd\") pod \"dnsmasq-dns-6dc86c69bc-2m5k9\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 
20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.896174 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-dns-svc\") pod \"dnsmasq-dns-6dc86c69bc-2m5k9\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.896193 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-config\") pod \"dnsmasq-dns-6dc86c69bc-2m5k9\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.896208 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc86c69bc-2m5k9\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.896231 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-config-data\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.896258 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc86c69bc-2m5k9\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.896301 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-fernet-keys\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.896342 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-combined-ca-bundle\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.896356 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-credential-keys\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.897161 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-config\") pod \"dnsmasq-dns-6dc86c69bc-2m5k9\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.898769 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc86c69bc-2m5k9\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.904164 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-credential-keys\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.904726 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-dns-svc\") pod \"dnsmasq-dns-6dc86c69bc-2m5k9\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.905263 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc86c69bc-2m5k9\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.911252 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-scripts\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.913900 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-config-data\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.914561 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-combined-ca-bundle\") pod \"keystone-bootstrap-vt5gp\" (UID: 
\"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.916488 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-fernet-keys\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.938189 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cnfd\" (UniqueName: \"kubernetes.io/projected/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-kube-api-access-5cnfd\") pod \"dnsmasq-dns-6dc86c69bc-2m5k9\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:00 crc kubenswrapper[4753]: I1005 20:31:00.943395 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lx2v\" (UniqueName: \"kubernetes.io/projected/7db01ca6-f0c9-4f53-888a-dc2c806310cd-kube-api-access-4lx2v\") pod \"keystone-bootstrap-vt5gp\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.002608 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.004335 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.006471 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.012482 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.012744 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.020084 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.032150 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.061752 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc86c69bc-2m5k9"] Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.106945 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-config-data\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.106987 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de197a05-86cb-4de6-b4f7-03e27dba1e02-run-httpd\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.107019 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwqfg\" (UniqueName: \"kubernetes.io/projected/de197a05-86cb-4de6-b4f7-03e27dba1e02-kube-api-access-nwqfg\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " 
pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.107049 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.107091 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-scripts\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.107108 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de197a05-86cb-4de6-b4f7-03e27dba1e02-log-httpd\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.107170 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.118848 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-flzmg"] Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.119915 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.122978 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.123195 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9m55q" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.123319 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.156764 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d5f9978df-kp8gx"] Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.158348 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.162061 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ncbw7" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.176436 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-flzmg"] Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.194376 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d5f9978df-kp8gx"] Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.221050 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-config-data\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.221078 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de197a05-86cb-4de6-b4f7-03e27dba1e02-run-httpd\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.221106 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwqfg\" (UniqueName: \"kubernetes.io/projected/de197a05-86cb-4de6-b4f7-03e27dba1e02-kube-api-access-nwqfg\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.221170 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.221202 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-scripts\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.221221 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de197a05-86cb-4de6-b4f7-03e27dba1e02-log-httpd\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.221256 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.224754 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de197a05-86cb-4de6-b4f7-03e27dba1e02-run-httpd\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.227153 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de197a05-86cb-4de6-b4f7-03e27dba1e02-log-httpd\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.227689 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.236463 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-config-data\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.240602 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.242211 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-scripts\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.249495 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwqfg\" (UniqueName: \"kubernetes.io/projected/de197a05-86cb-4de6-b4f7-03e27dba1e02-kube-api-access-nwqfg\") pod \"ceilometer-0\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.322636 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-combined-ca-bundle\") pod \"0521eb28-0c37-447f-a80c-60e5187098a5\" (UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.322729 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q26ht\" (UniqueName: \"kubernetes.io/projected/0521eb28-0c37-447f-a80c-60e5187098a5-kube-api-access-q26ht\") pod \"0521eb28-0c37-447f-a80c-60e5187098a5\" 
(UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.322752 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-db-sync-config-data\") pod \"0521eb28-0c37-447f-a80c-60e5187098a5\" (UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.322899 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-config-data\") pod \"0521eb28-0c37-447f-a80c-60e5187098a5\" (UID: \"0521eb28-0c37-447f-a80c-60e5187098a5\") " Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.323100 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-ovsdbserver-sb\") pod \"dnsmasq-dns-d5f9978df-kp8gx\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.323131 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-config\") pod \"dnsmasq-dns-d5f9978df-kp8gx\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.323171 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-logs\") pod \"placement-db-sync-flzmg\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.323211 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-combined-ca-bundle\") pod \"placement-db-sync-flzmg\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.323236 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhb4w\" (UniqueName: \"kubernetes.io/projected/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-kube-api-access-rhb4w\") pod \"dnsmasq-dns-d5f9978df-kp8gx\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.323263 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-dns-svc\") pod \"dnsmasq-dns-d5f9978df-kp8gx\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.323289 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-scripts\") pod \"placement-db-sync-flzmg\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.323309 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-config-data\") pod \"placement-db-sync-flzmg\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.323327 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr5bq\" (UniqueName: \"kubernetes.io/projected/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-kube-api-access-dr5bq\") pod \"placement-db-sync-flzmg\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.323350 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-ovsdbserver-nb\") pod \"dnsmasq-dns-d5f9978df-kp8gx\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.326975 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0521eb28-0c37-447f-a80c-60e5187098a5" (UID: "0521eb28-0c37-447f-a80c-60e5187098a5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.330925 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0521eb28-0c37-447f-a80c-60e5187098a5-kube-api-access-q26ht" (OuterVolumeSpecName: "kube-api-access-q26ht") pod "0521eb28-0c37-447f-a80c-60e5187098a5" (UID: "0521eb28-0c37-447f-a80c-60e5187098a5"). InnerVolumeSpecName "kube-api-access-q26ht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.346988 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0521eb28-0c37-447f-a80c-60e5187098a5" (UID: "0521eb28-0c37-447f-a80c-60e5187098a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.406021 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-config-data" (OuterVolumeSpecName: "config-data") pod "0521eb28-0c37-447f-a80c-60e5187098a5" (UID: "0521eb28-0c37-447f-a80c-60e5187098a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.424484 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-combined-ca-bundle\") pod \"placement-db-sync-flzmg\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.424546 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhb4w\" (UniqueName: \"kubernetes.io/projected/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-kube-api-access-rhb4w\") pod \"dnsmasq-dns-d5f9978df-kp8gx\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.424579 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-scripts\") pod \"placement-db-sync-flzmg\" (UID: 
\"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.424596 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-dns-svc\") pod \"dnsmasq-dns-d5f9978df-kp8gx\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.424617 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-config-data\") pod \"placement-db-sync-flzmg\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.424633 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr5bq\" (UniqueName: \"kubernetes.io/projected/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-kube-api-access-dr5bq\") pod \"placement-db-sync-flzmg\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.424655 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-ovsdbserver-nb\") pod \"dnsmasq-dns-d5f9978df-kp8gx\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.424714 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-ovsdbserver-sb\") pod \"dnsmasq-dns-d5f9978df-kp8gx\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 
05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.424730 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-config\") pod \"dnsmasq-dns-d5f9978df-kp8gx\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.424751 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-logs\") pod \"placement-db-sync-flzmg\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.424790 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.424801 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.424811 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q26ht\" (UniqueName: \"kubernetes.io/projected/0521eb28-0c37-447f-a80c-60e5187098a5-kube-api-access-q26ht\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.424820 4753 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0521eb28-0c37-447f-a80c-60e5187098a5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.425617 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-ovsdbserver-sb\") pod \"dnsmasq-dns-d5f9978df-kp8gx\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.426007 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-config\") pod \"dnsmasq-dns-d5f9978df-kp8gx\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.426625 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-dns-svc\") pod \"dnsmasq-dns-d5f9978df-kp8gx\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.430115 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-ovsdbserver-nb\") pod \"dnsmasq-dns-d5f9978df-kp8gx\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.430342 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-logs\") pod \"placement-db-sync-flzmg\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.430677 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-scripts\") pod \"placement-db-sync-flzmg\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " 
pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.430863 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-config-data\") pod \"placement-db-sync-flzmg\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.432211 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-combined-ca-bundle\") pod \"placement-db-sync-flzmg\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.444647 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.448988 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr5bq\" (UniqueName: \"kubernetes.io/projected/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-kube-api-access-dr5bq\") pod \"placement-db-sync-flzmg\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.451829 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhb4w\" (UniqueName: \"kubernetes.io/projected/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-kube-api-access-rhb4w\") pod \"dnsmasq-dns-d5f9978df-kp8gx\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.453206 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ncbw7" 
event={"ID":"0521eb28-0c37-447f-a80c-60e5187098a5","Type":"ContainerDied","Data":"533c1417504310f7461a1200ef52b79cddcc653f2d5df08b60b132bd9b649caf"} Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.453246 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="533c1417504310f7461a1200ef52b79cddcc653f2d5df08b60b132bd9b649caf" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.453303 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ncbw7" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.483487 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.493497 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.522667 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc86c69bc-2m5k9"] Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.657634 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vt5gp"] Oct 05 20:31:01 crc kubenswrapper[4753]: W1005 20:31:01.710847 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7db01ca6_f0c9_4f53_888a_dc2c806310cd.slice/crio-fb8b1419d91b36c2d9e788e615ff1dcb5f29282ffebeb5251c24154548bdc25f WatchSource:0}: Error finding container fb8b1419d91b36c2d9e788e615ff1dcb5f29282ffebeb5251c24154548bdc25f: Status 404 returned error can't find the container with id fb8b1419d91b36c2d9e788e615ff1dcb5f29282ffebeb5251c24154548bdc25f Oct 05 20:31:01 crc kubenswrapper[4753]: I1005 20:31:01.977761 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d5f9978df-kp8gx"] Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 
20:31:02.037070 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d8869ff97-2bt5k"] Oct 05 20:31:02 crc kubenswrapper[4753]: W1005 20:31:02.037330 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde197a05_86cb_4de6_b4f7_03e27dba1e02.slice/crio-988ebece8656993c27c5a1c60af8d23482c97ee990be05e17dfd5789790a54e0 WatchSource:0}: Error finding container 988ebece8656993c27c5a1c60af8d23482c97ee990be05e17dfd5789790a54e0: Status 404 returned error can't find the container with id 988ebece8656993c27c5a1c60af8d23482c97ee990be05e17dfd5789790a54e0 Oct 05 20:31:02 crc kubenswrapper[4753]: E1005 20:31:02.037411 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0521eb28-0c37-447f-a80c-60e5187098a5" containerName="glance-db-sync" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.037424 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="0521eb28-0c37-447f-a80c-60e5187098a5" containerName="glance-db-sync" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.037558 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="0521eb28-0c37-447f-a80c-60e5187098a5" containerName="glance-db-sync" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.041625 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.058418 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.089243 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d8869ff97-2bt5k"] Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.119010 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-dns-svc\") pod \"dnsmasq-dns-d8869ff97-2bt5k\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.119066 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2797l\" (UniqueName: \"kubernetes.io/projected/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-kube-api-access-2797l\") pod \"dnsmasq-dns-d8869ff97-2bt5k\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.119114 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-config\") pod \"dnsmasq-dns-d8869ff97-2bt5k\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.119147 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-ovsdbserver-nb\") pod \"dnsmasq-dns-d8869ff97-2bt5k\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 
20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.119180 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-ovsdbserver-sb\") pod \"dnsmasq-dns-d8869ff97-2bt5k\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.221910 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-ovsdbserver-sb\") pod \"dnsmasq-dns-d8869ff97-2bt5k\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.222007 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-dns-svc\") pod \"dnsmasq-dns-d8869ff97-2bt5k\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.222037 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2797l\" (UniqueName: \"kubernetes.io/projected/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-kube-api-access-2797l\") pod \"dnsmasq-dns-d8869ff97-2bt5k\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.222080 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-config\") pod \"dnsmasq-dns-d8869ff97-2bt5k\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.222098 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-ovsdbserver-nb\") pod \"dnsmasq-dns-d8869ff97-2bt5k\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.223127 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-ovsdbserver-nb\") pod \"dnsmasq-dns-d8869ff97-2bt5k\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.223681 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-ovsdbserver-sb\") pod \"dnsmasq-dns-d8869ff97-2bt5k\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.224171 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-dns-svc\") pod \"dnsmasq-dns-d8869ff97-2bt5k\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.224985 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-config\") pod \"dnsmasq-dns-d8869ff97-2bt5k\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.254100 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2797l\" (UniqueName: 
\"kubernetes.io/projected/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-kube-api-access-2797l\") pod \"dnsmasq-dns-d8869ff97-2bt5k\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.347119 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d5f9978df-kp8gx"] Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.410454 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.419974 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-flzmg"] Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.580248 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vt5gp" event={"ID":"7db01ca6-f0c9-4f53-888a-dc2c806310cd","Type":"ContainerStarted","Data":"6e63058d2cb40f0e763d5fab64e729403984404fc3d63a216d05106ab19053ac"} Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.580291 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vt5gp" event={"ID":"7db01ca6-f0c9-4f53-888a-dc2c806310cd","Type":"ContainerStarted","Data":"fb8b1419d91b36c2d9e788e615ff1dcb5f29282ffebeb5251c24154548bdc25f"} Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.607493 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-flzmg" event={"ID":"3154b1ca-ea81-4fc8-ba7d-ff439e97c930","Type":"ContainerStarted","Data":"7b00f5a821bfc58f9a4cbe137fbbd1b1c6aa8370b0bc24584658bd4a34460ba8"} Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.617124 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vt5gp" podStartSLOduration=2.617103373 podStartE2EDuration="2.617103373s" podCreationTimestamp="2025-10-05 20:31:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:31:02.608500014 +0000 UTC m=+971.456828246" watchObservedRunningTime="2025-10-05 20:31:02.617103373 +0000 UTC m=+971.465431605" Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.620879 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" event={"ID":"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc","Type":"ContainerStarted","Data":"e4484d733b68c1fc9cba0bd31e723b327c21cd334d7b84df24b2e66f5cf8e2ce"} Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.629724 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de197a05-86cb-4de6-b4f7-03e27dba1e02","Type":"ContainerStarted","Data":"988ebece8656993c27c5a1c60af8d23482c97ee990be05e17dfd5789790a54e0"} Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.634591 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" event={"ID":"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e","Type":"ContainerStarted","Data":"a4aff760cd7f03a3a7ea819d5b80a667dd24d33d92fc65fc638a9e2b413eda2a"} Oct 05 20:31:02 crc kubenswrapper[4753]: I1005 20:31:02.634629 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" event={"ID":"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e","Type":"ContainerStarted","Data":"d4b81ade803bddc8f609b6df28774ef9470c382e46dc215d3239bd2293ef02ad"} Oct 05 20:31:03 crc kubenswrapper[4753]: I1005 20:31:03.693301 4753 generic.go:334] "Generic (PLEG): container finished" podID="c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc" containerID="718d6202771eaf57ef0a925cf96d9ca63c2b1c525f89a738d626aff922d1d676" exitCode=0 Oct 05 20:31:03 crc kubenswrapper[4753]: I1005 20:31:03.693874 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" 
event={"ID":"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc","Type":"ContainerDied","Data":"718d6202771eaf57ef0a925cf96d9ca63c2b1c525f89a738d626aff922d1d676"} Oct 05 20:31:03 crc kubenswrapper[4753]: I1005 20:31:03.716514 4753 generic.go:334] "Generic (PLEG): container finished" podID="aa2fe04c-6412-4f34-96f5-d9fc4af78c9e" containerID="a4aff760cd7f03a3a7ea819d5b80a667dd24d33d92fc65fc638a9e2b413eda2a" exitCode=0 Oct 05 20:31:03 crc kubenswrapper[4753]: I1005 20:31:03.716750 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" event={"ID":"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e","Type":"ContainerDied","Data":"a4aff760cd7f03a3a7ea819d5b80a667dd24d33d92fc65fc638a9e2b413eda2a"} Oct 05 20:31:03 crc kubenswrapper[4753]: I1005 20:31:03.912230 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.003972 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-dns-svc\") pod \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.004043 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-ovsdbserver-sb\") pod \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.004111 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-config\") pod \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 
20:31:04.004238 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cnfd\" (UniqueName: \"kubernetes.io/projected/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-kube-api-access-5cnfd\") pod \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.004305 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-ovsdbserver-nb\") pod \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\" (UID: \"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e\") " Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.030612 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.040944 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-kube-api-access-5cnfd" (OuterVolumeSpecName: "kube-api-access-5cnfd") pod "aa2fe04c-6412-4f34-96f5-d9fc4af78c9e" (UID: "aa2fe04c-6412-4f34-96f5-d9fc4af78c9e"). InnerVolumeSpecName "kube-api-access-5cnfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.069187 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa2fe04c-6412-4f34-96f5-d9fc4af78c9e" (UID: "aa2fe04c-6412-4f34-96f5-d9fc4af78c9e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.084883 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa2fe04c-6412-4f34-96f5-d9fc4af78c9e" (UID: "aa2fe04c-6412-4f34-96f5-d9fc4af78c9e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.086179 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d8869ff97-2bt5k"] Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.113565 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.113585 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cnfd\" (UniqueName: \"kubernetes.io/projected/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-kube-api-access-5cnfd\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.113596 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.114754 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa2fe04c-6412-4f34-96f5-d9fc4af78c9e" (UID: "aa2fe04c-6412-4f34-96f5-d9fc4af78c9e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.138539 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-config" (OuterVolumeSpecName: "config") pod "aa2fe04c-6412-4f34-96f5-d9fc4af78c9e" (UID: "aa2fe04c-6412-4f34-96f5-d9fc4af78c9e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.141896 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7ccf-account-create-874gd"] Oct 05 20:31:04 crc kubenswrapper[4753]: E1005 20:31:04.142286 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2fe04c-6412-4f34-96f5-d9fc4af78c9e" containerName="init" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.142303 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2fe04c-6412-4f34-96f5-d9fc4af78c9e" containerName="init" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.142465 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa2fe04c-6412-4f34-96f5-d9fc4af78c9e" containerName="init" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.142987 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7ccf-account-create-874gd" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.148930 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 05 20:31:04 crc kubenswrapper[4753]: W1005 20:31:04.149413 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff380460_00c8_4b7e_b3e0_1d7fb18ed268.slice/crio-76f0a33fd0049981fb130bcb059f7da2a4eece5fdecaa34436b86c6d2f5ec5ac WatchSource:0}: Error finding container 76f0a33fd0049981fb130bcb059f7da2a4eece5fdecaa34436b86c6d2f5ec5ac: Status 404 returned error can't find the container with id 76f0a33fd0049981fb130bcb059f7da2a4eece5fdecaa34436b86c6d2f5ec5ac Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.150728 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ccf-account-create-874gd"] Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.221385 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf9c8\" (UniqueName: \"kubernetes.io/projected/189f9e06-87f8-47ee-83be-6e479c93ec63-kube-api-access-bf9c8\") pod \"cinder-7ccf-account-create-874gd\" (UID: \"189f9e06-87f8-47ee-83be-6e479c93ec63\") " pod="openstack/cinder-7ccf-account-create-874gd" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.221484 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.221497 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.224532 4753 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-b8c4-account-create-9bhss"] Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.248596 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b8c4-account-create-9bhss" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.250893 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.262898 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b8c4-account-create-9bhss"] Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.322916 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gvxm\" (UniqueName: \"kubernetes.io/projected/fef0457b-e235-4baf-8095-736b17c17fd7-kube-api-access-5gvxm\") pod \"barbican-b8c4-account-create-9bhss\" (UID: \"fef0457b-e235-4baf-8095-736b17c17fd7\") " pod="openstack/barbican-b8c4-account-create-9bhss" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.323056 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf9c8\" (UniqueName: \"kubernetes.io/projected/189f9e06-87f8-47ee-83be-6e479c93ec63-kube-api-access-bf9c8\") pod \"cinder-7ccf-account-create-874gd\" (UID: \"189f9e06-87f8-47ee-83be-6e479c93ec63\") " pod="openstack/cinder-7ccf-account-create-874gd" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.343905 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf9c8\" (UniqueName: \"kubernetes.io/projected/189f9e06-87f8-47ee-83be-6e479c93ec63-kube-api-access-bf9c8\") pod \"cinder-7ccf-account-create-874gd\" (UID: \"189f9e06-87f8-47ee-83be-6e479c93ec63\") " pod="openstack/cinder-7ccf-account-create-874gd" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.347402 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.423962 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhb4w\" (UniqueName: \"kubernetes.io/projected/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-kube-api-access-rhb4w\") pod \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.424044 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-dns-svc\") pod \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.424257 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-ovsdbserver-nb\") pod \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.424283 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-config\") pod \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.424305 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-ovsdbserver-sb\") pod \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\" (UID: \"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc\") " Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.424525 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gvxm\" 
(UniqueName: \"kubernetes.io/projected/fef0457b-e235-4baf-8095-736b17c17fd7-kube-api-access-5gvxm\") pod \"barbican-b8c4-account-create-9bhss\" (UID: \"fef0457b-e235-4baf-8095-736b17c17fd7\") " pod="openstack/barbican-b8c4-account-create-9bhss" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.425187 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1002-account-create-wctbs"] Oct 05 20:31:04 crc kubenswrapper[4753]: E1005 20:31:04.425566 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc" containerName="init" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.425584 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc" containerName="init" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.425750 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc" containerName="init" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.428343 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-kube-api-access-rhb4w" (OuterVolumeSpecName: "kube-api-access-rhb4w") pod "c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc" (UID: "c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc"). InnerVolumeSpecName "kube-api-access-rhb4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.428594 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1002-account-create-wctbs" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.430509 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1002-account-create-wctbs"] Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.432266 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.451990 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc" (UID: "c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.474983 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc" (UID: "c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.475651 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gvxm\" (UniqueName: \"kubernetes.io/projected/fef0457b-e235-4baf-8095-736b17c17fd7-kube-api-access-5gvxm\") pod \"barbican-b8c4-account-create-9bhss\" (UID: \"fef0457b-e235-4baf-8095-736b17c17fd7\") " pod="openstack/barbican-b8c4-account-create-9bhss" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.484268 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-config" (OuterVolumeSpecName: "config") pod "c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc" (UID: "c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.489901 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.489950 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.492581 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc" (UID: "c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.526392 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tpld\" (UniqueName: \"kubernetes.io/projected/60a64254-c591-486d-bd24-c1fc63bdb561-kube-api-access-6tpld\") pod \"neutron-1002-account-create-wctbs\" (UID: \"60a64254-c591-486d-bd24-c1fc63bdb561\") " pod="openstack/neutron-1002-account-create-wctbs" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.526668 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.526681 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.526690 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.526700 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhb4w\" (UniqueName: \"kubernetes.io/projected/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-kube-api-access-rhb4w\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.526711 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.615827 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7ccf-account-create-874gd" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.626184 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b8c4-account-create-9bhss" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.627501 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tpld\" (UniqueName: \"kubernetes.io/projected/60a64254-c591-486d-bd24-c1fc63bdb561-kube-api-access-6tpld\") pod \"neutron-1002-account-create-wctbs\" (UID: \"60a64254-c591-486d-bd24-c1fc63bdb561\") " pod="openstack/neutron-1002-account-create-wctbs" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.648613 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tpld\" (UniqueName: \"kubernetes.io/projected/60a64254-c591-486d-bd24-c1fc63bdb561-kube-api-access-6tpld\") pod \"neutron-1002-account-create-wctbs\" (UID: \"60a64254-c591-486d-bd24-c1fc63bdb561\") " pod="openstack/neutron-1002-account-create-wctbs" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.734798 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" event={"ID":"c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc","Type":"ContainerDied","Data":"e4484d733b68c1fc9cba0bd31e723b327c21cd334d7b84df24b2e66f5cf8e2ce"} Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.734850 4753 scope.go:117] "RemoveContainer" containerID="718d6202771eaf57ef0a925cf96d9ca63c2b1c525f89a738d626aff922d1d676" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.734877 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5f9978df-kp8gx" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.765270 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1002-account-create-wctbs" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.777407 4753 generic.go:334] "Generic (PLEG): container finished" podID="ff380460-00c8-4b7e-b3e0-1d7fb18ed268" containerID="1e5582aceea0130d0cd2cad433ca22e90d4de87aa5b710c6753dd08d8bdee954" exitCode=0 Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.777713 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" event={"ID":"ff380460-00c8-4b7e-b3e0-1d7fb18ed268","Type":"ContainerDied","Data":"1e5582aceea0130d0cd2cad433ca22e90d4de87aa5b710c6753dd08d8bdee954"} Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.777738 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" event={"ID":"ff380460-00c8-4b7e-b3e0-1d7fb18ed268","Type":"ContainerStarted","Data":"76f0a33fd0049981fb130bcb059f7da2a4eece5fdecaa34436b86c6d2f5ec5ac"} Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.783251 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" event={"ID":"aa2fe04c-6412-4f34-96f5-d9fc4af78c9e","Type":"ContainerDied","Data":"d4b81ade803bddc8f609b6df28774ef9470c382e46dc215d3239bd2293ef02ad"} Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.783307 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dc86c69bc-2m5k9" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.800600 4753 scope.go:117] "RemoveContainer" containerID="a4aff760cd7f03a3a7ea819d5b80a667dd24d33d92fc65fc638a9e2b413eda2a" Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.881220 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d5f9978df-kp8gx"] Oct 05 20:31:04 crc kubenswrapper[4753]: I1005 20:31:04.891442 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d5f9978df-kp8gx"] Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.054196 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc86c69bc-2m5k9"] Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.061223 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dc86c69bc-2m5k9"] Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.192126 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b8c4-account-create-9bhss"] Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.206259 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ccf-account-create-874gd"] Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.437925 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1002-account-create-wctbs"] Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.795380 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" event={"ID":"ff380460-00c8-4b7e-b3e0-1d7fb18ed268","Type":"ContainerStarted","Data":"b70d465c01a87ed4baec0992a924024a9f770e05dbbab822b47d2b9a06e39537"} Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.796546 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.803890 4753 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-1002-account-create-wctbs" event={"ID":"60a64254-c591-486d-bd24-c1fc63bdb561","Type":"ContainerStarted","Data":"500836df6df635c1e403426c6e29d4f986a6b8355e0f7dee8a877f4e5585f365"} Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.803937 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1002-account-create-wctbs" event={"ID":"60a64254-c591-486d-bd24-c1fc63bdb561","Type":"ContainerStarted","Data":"bf4f821f57285be2c339044cb26ae985a730988cb1be116191cca7363e6e4e5e"} Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.814055 4753 generic.go:334] "Generic (PLEG): container finished" podID="fef0457b-e235-4baf-8095-736b17c17fd7" containerID="bfeaa6dc1c6c9a840673b656a702681d5c9be98c793e4a857717f0653f7406b1" exitCode=0 Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.814246 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b8c4-account-create-9bhss" event={"ID":"fef0457b-e235-4baf-8095-736b17c17fd7","Type":"ContainerDied","Data":"bfeaa6dc1c6c9a840673b656a702681d5c9be98c793e4a857717f0653f7406b1"} Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.814294 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b8c4-account-create-9bhss" event={"ID":"fef0457b-e235-4baf-8095-736b17c17fd7","Type":"ContainerStarted","Data":"ffd668c698890e06c8471ac567b591f846b5943c1dc16fb2ff5b563f25990126"} Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.818489 4753 generic.go:334] "Generic (PLEG): container finished" podID="189f9e06-87f8-47ee-83be-6e479c93ec63" containerID="3f013a731f19326bb650fe30d8ba5ae07e7dcce7f5fd634c603aa328ad9655ab" exitCode=0 Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.818540 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ccf-account-create-874gd" 
event={"ID":"189f9e06-87f8-47ee-83be-6e479c93ec63","Type":"ContainerDied","Data":"3f013a731f19326bb650fe30d8ba5ae07e7dcce7f5fd634c603aa328ad9655ab"} Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.818559 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ccf-account-create-874gd" event={"ID":"189f9e06-87f8-47ee-83be-6e479c93ec63","Type":"ContainerStarted","Data":"2e6a0c137fdf25d2d29f652fb42836dab999bf2e7ddbe57044cecb1e81f0433c"} Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.829531 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" podStartSLOduration=4.8295134 podStartE2EDuration="4.8295134s" podCreationTimestamp="2025-10-05 20:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:31:05.817827375 +0000 UTC m=+974.666155617" watchObservedRunningTime="2025-10-05 20:31:05.8295134 +0000 UTC m=+974.677841632" Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.862796 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-1002-account-create-wctbs" podStartSLOduration=1.86277779 podStartE2EDuration="1.86277779s" podCreationTimestamp="2025-10-05 20:31:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:31:05.860391755 +0000 UTC m=+974.708719977" watchObservedRunningTime="2025-10-05 20:31:05.86277779 +0000 UTC m=+974.711106022" Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.868310 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa2fe04c-6412-4f34-96f5-d9fc4af78c9e" path="/var/lib/kubelet/pods/aa2fe04c-6412-4f34-96f5-d9fc4af78c9e/volumes" Oct 05 20:31:05 crc kubenswrapper[4753]: I1005 20:31:05.868869 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc" path="/var/lib/kubelet/pods/c8bc3ca1-f770-4c72-9ae1-f19a40cbaadc/volumes" Oct 05 20:31:06 crc kubenswrapper[4753]: I1005 20:31:06.839924 4753 generic.go:334] "Generic (PLEG): container finished" podID="60a64254-c591-486d-bd24-c1fc63bdb561" containerID="500836df6df635c1e403426c6e29d4f986a6b8355e0f7dee8a877f4e5585f365" exitCode=0 Oct 05 20:31:06 crc kubenswrapper[4753]: I1005 20:31:06.840082 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1002-account-create-wctbs" event={"ID":"60a64254-c591-486d-bd24-c1fc63bdb561","Type":"ContainerDied","Data":"500836df6df635c1e403426c6e29d4f986a6b8355e0f7dee8a877f4e5585f365"} Oct 05 20:31:07 crc kubenswrapper[4753]: I1005 20:31:07.874846 4753 generic.go:334] "Generic (PLEG): container finished" podID="7db01ca6-f0c9-4f53-888a-dc2c806310cd" containerID="6e63058d2cb40f0e763d5fab64e729403984404fc3d63a216d05106ab19053ac" exitCode=0 Oct 05 20:31:07 crc kubenswrapper[4753]: I1005 20:31:07.874916 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vt5gp" event={"ID":"7db01ca6-f0c9-4f53-888a-dc2c806310cd","Type":"ContainerDied","Data":"6e63058d2cb40f0e763d5fab64e729403984404fc3d63a216d05106ab19053ac"} Oct 05 20:31:08 crc kubenswrapper[4753]: I1005 20:31:08.018923 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7ccf-account-create-874gd" Oct 05 20:31:08 crc kubenswrapper[4753]: I1005 20:31:08.110643 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf9c8\" (UniqueName: \"kubernetes.io/projected/189f9e06-87f8-47ee-83be-6e479c93ec63-kube-api-access-bf9c8\") pod \"189f9e06-87f8-47ee-83be-6e479c93ec63\" (UID: \"189f9e06-87f8-47ee-83be-6e479c93ec63\") " Oct 05 20:31:08 crc kubenswrapper[4753]: I1005 20:31:08.116345 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189f9e06-87f8-47ee-83be-6e479c93ec63-kube-api-access-bf9c8" (OuterVolumeSpecName: "kube-api-access-bf9c8") pod "189f9e06-87f8-47ee-83be-6e479c93ec63" (UID: "189f9e06-87f8-47ee-83be-6e479c93ec63"). InnerVolumeSpecName "kube-api-access-bf9c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:08 crc kubenswrapper[4753]: I1005 20:31:08.213086 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf9c8\" (UniqueName: \"kubernetes.io/projected/189f9e06-87f8-47ee-83be-6e479c93ec63-kube-api-access-bf9c8\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:08 crc kubenswrapper[4753]: I1005 20:31:08.883299 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ccf-account-create-874gd" event={"ID":"189f9e06-87f8-47ee-83be-6e479c93ec63","Type":"ContainerDied","Data":"2e6a0c137fdf25d2d29f652fb42836dab999bf2e7ddbe57044cecb1e81f0433c"} Oct 05 20:31:08 crc kubenswrapper[4753]: I1005 20:31:08.883348 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e6a0c137fdf25d2d29f652fb42836dab999bf2e7ddbe57044cecb1e81f0433c" Oct 05 20:31:08 crc kubenswrapper[4753]: I1005 20:31:08.883421 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7ccf-account-create-874gd" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.563368 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-tnf7s"] Oct 05 20:31:09 crc kubenswrapper[4753]: E1005 20:31:09.563855 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189f9e06-87f8-47ee-83be-6e479c93ec63" containerName="mariadb-account-create" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.563880 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="189f9e06-87f8-47ee-83be-6e479c93ec63" containerName="mariadb-account-create" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.564117 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="189f9e06-87f8-47ee-83be-6e479c93ec63" containerName="mariadb-account-create" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.564803 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.573363 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tnf7s"] Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.574448 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-92d8j" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.574890 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.575013 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.738924 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-config-data\") pod \"cinder-db-sync-tnf7s\" (UID: 
\"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.738983 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-scripts\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.739105 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3841483-2af9-40d8-8197-531d8dd1e57f-etc-machine-id\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.739218 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mj2v\" (UniqueName: \"kubernetes.io/projected/e3841483-2af9-40d8-8197-531d8dd1e57f-kube-api-access-8mj2v\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.739255 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-combined-ca-bundle\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.739359 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-db-sync-config-data\") pod \"cinder-db-sync-tnf7s\" (UID: 
\"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.840518 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mj2v\" (UniqueName: \"kubernetes.io/projected/e3841483-2af9-40d8-8197-531d8dd1e57f-kube-api-access-8mj2v\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.840572 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-combined-ca-bundle\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.840620 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-db-sync-config-data\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.840645 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-config-data\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.840673 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-scripts\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 
20:31:09.840715 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3841483-2af9-40d8-8197-531d8dd1e57f-etc-machine-id\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.840793 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3841483-2af9-40d8-8197-531d8dd1e57f-etc-machine-id\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.847327 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-config-data\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.853929 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-db-sync-config-data\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.854817 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-scripts\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.855634 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-combined-ca-bundle\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.860331 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mj2v\" (UniqueName: \"kubernetes.io/projected/e3841483-2af9-40d8-8197-531d8dd1e57f-kube-api-access-8mj2v\") pod \"cinder-db-sync-tnf7s\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:09 crc kubenswrapper[4753]: I1005 20:31:09.882380 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.412695 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.451355 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.466058 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8454ffc489-w9l78"] Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.466293 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8454ffc489-w9l78" podUID="ff73c79b-7168-4596-b74c-136ff3bfff2f" containerName="dnsmasq-dns" containerID="cri-o://e5a1bcd610b0456cf97746c2c7f514702f64262ca03ee94df46bef50b2bb7436" gracePeriod=10 Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.485265 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b8c4-account-create-9bhss" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.534437 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1002-account-create-wctbs" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.589254 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-credential-keys\") pod \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.589437 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-fernet-keys\") pod \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.589492 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lx2v\" (UniqueName: \"kubernetes.io/projected/7db01ca6-f0c9-4f53-888a-dc2c806310cd-kube-api-access-4lx2v\") pod \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.589602 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gvxm\" (UniqueName: \"kubernetes.io/projected/fef0457b-e235-4baf-8095-736b17c17fd7-kube-api-access-5gvxm\") pod \"fef0457b-e235-4baf-8095-736b17c17fd7\" (UID: \"fef0457b-e235-4baf-8095-736b17c17fd7\") " Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.590305 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-combined-ca-bundle\") pod \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.590365 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-scripts\") pod \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.590400 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-config-data\") pod \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\" (UID: \"7db01ca6-f0c9-4f53-888a-dc2c806310cd\") " Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.594223 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7db01ca6-f0c9-4f53-888a-dc2c806310cd" (UID: "7db01ca6-f0c9-4f53-888a-dc2c806310cd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.596040 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7db01ca6-f0c9-4f53-888a-dc2c806310cd-kube-api-access-4lx2v" (OuterVolumeSpecName: "kube-api-access-4lx2v") pod "7db01ca6-f0c9-4f53-888a-dc2c806310cd" (UID: "7db01ca6-f0c9-4f53-888a-dc2c806310cd"). InnerVolumeSpecName "kube-api-access-4lx2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.598182 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fef0457b-e235-4baf-8095-736b17c17fd7-kube-api-access-5gvxm" (OuterVolumeSpecName: "kube-api-access-5gvxm") pod "fef0457b-e235-4baf-8095-736b17c17fd7" (UID: "fef0457b-e235-4baf-8095-736b17c17fd7"). InnerVolumeSpecName "kube-api-access-5gvxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.600169 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7db01ca6-f0c9-4f53-888a-dc2c806310cd" (UID: "7db01ca6-f0c9-4f53-888a-dc2c806310cd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.600236 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-scripts" (OuterVolumeSpecName: "scripts") pod "7db01ca6-f0c9-4f53-888a-dc2c806310cd" (UID: "7db01ca6-f0c9-4f53-888a-dc2c806310cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.629355 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7db01ca6-f0c9-4f53-888a-dc2c806310cd" (UID: "7db01ca6-f0c9-4f53-888a-dc2c806310cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.672528 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-config-data" (OuterVolumeSpecName: "config-data") pod "7db01ca6-f0c9-4f53-888a-dc2c806310cd" (UID: "7db01ca6-f0c9-4f53-888a-dc2c806310cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.694169 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tpld\" (UniqueName: \"kubernetes.io/projected/60a64254-c591-486d-bd24-c1fc63bdb561-kube-api-access-6tpld\") pod \"60a64254-c591-486d-bd24-c1fc63bdb561\" (UID: \"60a64254-c591-486d-bd24-c1fc63bdb561\") " Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.694511 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.694524 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.694533 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.694540 4753 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.694548 4753 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7db01ca6-f0c9-4f53-888a-dc2c806310cd-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.694556 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lx2v\" (UniqueName: \"kubernetes.io/projected/7db01ca6-f0c9-4f53-888a-dc2c806310cd-kube-api-access-4lx2v\") on node \"crc\" 
DevicePath \"\"" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.694566 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gvxm\" (UniqueName: \"kubernetes.io/projected/fef0457b-e235-4baf-8095-736b17c17fd7-kube-api-access-5gvxm\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.697064 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60a64254-c591-486d-bd24-c1fc63bdb561-kube-api-access-6tpld" (OuterVolumeSpecName: "kube-api-access-6tpld") pod "60a64254-c591-486d-bd24-c1fc63bdb561" (UID: "60a64254-c591-486d-bd24-c1fc63bdb561"). InnerVolumeSpecName "kube-api-access-6tpld". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.726702 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tnf7s"] Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.796045 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tpld\" (UniqueName: \"kubernetes.io/projected/60a64254-c591-486d-bd24-c1fc63bdb561-kube-api-access-6tpld\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.874419 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.934541 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1002-account-create-wctbs" event={"ID":"60a64254-c591-486d-bd24-c1fc63bdb561","Type":"ContainerDied","Data":"bf4f821f57285be2c339044cb26ae985a730988cb1be116191cca7363e6e4e5e"} Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.934838 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf4f821f57285be2c339044cb26ae985a730988cb1be116191cca7363e6e4e5e" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.934888 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1002-account-create-wctbs" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.938443 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tnf7s" event={"ID":"e3841483-2af9-40d8-8197-531d8dd1e57f","Type":"ContainerStarted","Data":"0f14a13b63704478d02f231e152e69837d90cafb0b089ae01ef904d6a520f7af"} Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.939469 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b8c4-account-create-9bhss" event={"ID":"fef0457b-e235-4baf-8095-736b17c17fd7","Type":"ContainerDied","Data":"ffd668c698890e06c8471ac567b591f846b5943c1dc16fb2ff5b563f25990126"} Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.939491 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffd668c698890e06c8471ac567b591f846b5943c1dc16fb2ff5b563f25990126" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.939532 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b8c4-account-create-9bhss" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.949499 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-flzmg" event={"ID":"3154b1ca-ea81-4fc8-ba7d-ff439e97c930","Type":"ContainerStarted","Data":"1f65b40afdeba568afe9de88bd091a6b6b710eede63ee93a09fb76c30c3a1b29"} Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.951411 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de197a05-86cb-4de6-b4f7-03e27dba1e02","Type":"ContainerStarted","Data":"44a4f4ab27768bfc64dea4cc8500bdf65997de078dc781cdb323b3b8ce0b78f0"} Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.953385 4753 generic.go:334] "Generic (PLEG): container finished" podID="ff73c79b-7168-4596-b74c-136ff3bfff2f" containerID="e5a1bcd610b0456cf97746c2c7f514702f64262ca03ee94df46bef50b2bb7436" exitCode=0 Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.953428 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8454ffc489-w9l78" event={"ID":"ff73c79b-7168-4596-b74c-136ff3bfff2f","Type":"ContainerDied","Data":"e5a1bcd610b0456cf97746c2c7f514702f64262ca03ee94df46bef50b2bb7436"} Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.953446 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8454ffc489-w9l78" event={"ID":"ff73c79b-7168-4596-b74c-136ff3bfff2f","Type":"ContainerDied","Data":"6bddacf8424aee76c6b03c21ec58520f325cd5f75f9fdce84847f7a4e59996a9"} Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.953461 4753 scope.go:117] "RemoveContainer" containerID="e5a1bcd610b0456cf97746c2c7f514702f64262ca03ee94df46bef50b2bb7436" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.953546 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8454ffc489-w9l78" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.957857 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vt5gp" event={"ID":"7db01ca6-f0c9-4f53-888a-dc2c806310cd","Type":"ContainerDied","Data":"fb8b1419d91b36c2d9e788e615ff1dcb5f29282ffebeb5251c24154548bdc25f"} Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.957892 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb8b1419d91b36c2d9e788e615ff1dcb5f29282ffebeb5251c24154548bdc25f" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.957948 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vt5gp" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.965603 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-flzmg" podStartSLOduration=2.144087891 podStartE2EDuration="11.965525481s" podCreationTimestamp="2025-10-05 20:31:01 +0000 UTC" firstStartedPulling="2025-10-05 20:31:02.437865102 +0000 UTC m=+971.286193334" lastFinishedPulling="2025-10-05 20:31:12.259302702 +0000 UTC m=+981.107630924" observedRunningTime="2025-10-05 20:31:12.963629772 +0000 UTC m=+981.811958014" watchObservedRunningTime="2025-10-05 20:31:12.965525481 +0000 UTC m=+981.813853713" Oct 05 20:31:12 crc kubenswrapper[4753]: I1005 20:31:12.991342 4753 scope.go:117] "RemoveContainer" containerID="6d0ab7710a68faf6a81708df96132f3f2485902d834accaa608d5eea2bf40135" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:12.999208 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-ovsdbserver-sb\") pod \"ff73c79b-7168-4596-b74c-136ff3bfff2f\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:12.999368 
4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-config\") pod \"ff73c79b-7168-4596-b74c-136ff3bfff2f\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:12.999438 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-656dx\" (UniqueName: \"kubernetes.io/projected/ff73c79b-7168-4596-b74c-136ff3bfff2f-kube-api-access-656dx\") pod \"ff73c79b-7168-4596-b74c-136ff3bfff2f\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:12.999485 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-ovsdbserver-nb\") pod \"ff73c79b-7168-4596-b74c-136ff3bfff2f\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:12.999533 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-dns-svc\") pod \"ff73c79b-7168-4596-b74c-136ff3bfff2f\" (UID: \"ff73c79b-7168-4596-b74c-136ff3bfff2f\") " Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.006985 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff73c79b-7168-4596-b74c-136ff3bfff2f-kube-api-access-656dx" (OuterVolumeSpecName: "kube-api-access-656dx") pod "ff73c79b-7168-4596-b74c-136ff3bfff2f" (UID: "ff73c79b-7168-4596-b74c-136ff3bfff2f"). InnerVolumeSpecName "kube-api-access-656dx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.033997 4753 scope.go:117] "RemoveContainer" containerID="e5a1bcd610b0456cf97746c2c7f514702f64262ca03ee94df46bef50b2bb7436" Oct 05 20:31:13 crc kubenswrapper[4753]: E1005 20:31:13.034629 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a1bcd610b0456cf97746c2c7f514702f64262ca03ee94df46bef50b2bb7436\": container with ID starting with e5a1bcd610b0456cf97746c2c7f514702f64262ca03ee94df46bef50b2bb7436 not found: ID does not exist" containerID="e5a1bcd610b0456cf97746c2c7f514702f64262ca03ee94df46bef50b2bb7436" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.034672 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a1bcd610b0456cf97746c2c7f514702f64262ca03ee94df46bef50b2bb7436"} err="failed to get container status \"e5a1bcd610b0456cf97746c2c7f514702f64262ca03ee94df46bef50b2bb7436\": rpc error: code = NotFound desc = could not find container \"e5a1bcd610b0456cf97746c2c7f514702f64262ca03ee94df46bef50b2bb7436\": container with ID starting with e5a1bcd610b0456cf97746c2c7f514702f64262ca03ee94df46bef50b2bb7436 not found: ID does not exist" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.034697 4753 scope.go:117] "RemoveContainer" containerID="6d0ab7710a68faf6a81708df96132f3f2485902d834accaa608d5eea2bf40135" Oct 05 20:31:13 crc kubenswrapper[4753]: E1005 20:31:13.034930 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0ab7710a68faf6a81708df96132f3f2485902d834accaa608d5eea2bf40135\": container with ID starting with 6d0ab7710a68faf6a81708df96132f3f2485902d834accaa608d5eea2bf40135 not found: ID does not exist" containerID="6d0ab7710a68faf6a81708df96132f3f2485902d834accaa608d5eea2bf40135" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.034948 
4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0ab7710a68faf6a81708df96132f3f2485902d834accaa608d5eea2bf40135"} err="failed to get container status \"6d0ab7710a68faf6a81708df96132f3f2485902d834accaa608d5eea2bf40135\": rpc error: code = NotFound desc = could not find container \"6d0ab7710a68faf6a81708df96132f3f2485902d834accaa608d5eea2bf40135\": container with ID starting with 6d0ab7710a68faf6a81708df96132f3f2485902d834accaa608d5eea2bf40135 not found: ID does not exist" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.043921 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff73c79b-7168-4596-b74c-136ff3bfff2f" (UID: "ff73c79b-7168-4596-b74c-136ff3bfff2f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.046405 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-config" (OuterVolumeSpecName: "config") pod "ff73c79b-7168-4596-b74c-136ff3bfff2f" (UID: "ff73c79b-7168-4596-b74c-136ff3bfff2f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.051882 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff73c79b-7168-4596-b74c-136ff3bfff2f" (UID: "ff73c79b-7168-4596-b74c-136ff3bfff2f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.057247 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff73c79b-7168-4596-b74c-136ff3bfff2f" (UID: "ff73c79b-7168-4596-b74c-136ff3bfff2f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.103270 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.103301 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.103311 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.103321 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff73c79b-7168-4596-b74c-136ff3bfff2f-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.103332 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-656dx\" (UniqueName: \"kubernetes.io/projected/ff73c79b-7168-4596-b74c-136ff3bfff2f-kube-api-access-656dx\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.290365 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8454ffc489-w9l78"] Oct 05 
20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.298027 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8454ffc489-w9l78"] Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.574535 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vt5gp"] Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.583104 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vt5gp"] Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.633884 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7btxb"] Oct 05 20:31:13 crc kubenswrapper[4753]: E1005 20:31:13.634231 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fef0457b-e235-4baf-8095-736b17c17fd7" containerName="mariadb-account-create" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.634248 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fef0457b-e235-4baf-8095-736b17c17fd7" containerName="mariadb-account-create" Oct 05 20:31:13 crc kubenswrapper[4753]: E1005 20:31:13.634266 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff73c79b-7168-4596-b74c-136ff3bfff2f" containerName="init" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.634272 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff73c79b-7168-4596-b74c-136ff3bfff2f" containerName="init" Oct 05 20:31:13 crc kubenswrapper[4753]: E1005 20:31:13.634284 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff73c79b-7168-4596-b74c-136ff3bfff2f" containerName="dnsmasq-dns" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.634290 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff73c79b-7168-4596-b74c-136ff3bfff2f" containerName="dnsmasq-dns" Oct 05 20:31:13 crc kubenswrapper[4753]: E1005 20:31:13.634309 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a64254-c591-486d-bd24-c1fc63bdb561" 
containerName="mariadb-account-create" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.634315 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a64254-c591-486d-bd24-c1fc63bdb561" containerName="mariadb-account-create" Oct 05 20:31:13 crc kubenswrapper[4753]: E1005 20:31:13.634322 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db01ca6-f0c9-4f53-888a-dc2c806310cd" containerName="keystone-bootstrap" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.634329 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db01ca6-f0c9-4f53-888a-dc2c806310cd" containerName="keystone-bootstrap" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.634463 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff73c79b-7168-4596-b74c-136ff3bfff2f" containerName="dnsmasq-dns" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.634475 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a64254-c591-486d-bd24-c1fc63bdb561" containerName="mariadb-account-create" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.634492 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fef0457b-e235-4baf-8095-736b17c17fd7" containerName="mariadb-account-create" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.634503 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db01ca6-f0c9-4f53-888a-dc2c806310cd" containerName="keystone-bootstrap" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.634983 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.636811 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.637017 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.638714 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2mmgx" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.654222 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7btxb"] Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.656630 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.710476 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-config-data\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.710546 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-fernet-keys\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.710592 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-scripts\") pod \"keystone-bootstrap-7btxb\" (UID: 
\"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.710615 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-combined-ca-bundle\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.710793 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d56r2\" (UniqueName: \"kubernetes.io/projected/04bd2ecd-c468-4efa-877b-983c27dde353-kube-api-access-d56r2\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.710836 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-credential-keys\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.812535 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-fernet-keys\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.812590 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-scripts\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " 
pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.812615 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-combined-ca-bundle\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.812719 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d56r2\" (UniqueName: \"kubernetes.io/projected/04bd2ecd-c468-4efa-877b-983c27dde353-kube-api-access-d56r2\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.812736 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-credential-keys\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.812768 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-config-data\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.818477 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-config-data\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 
20:31:13.831699 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-fernet-keys\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.832352 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-combined-ca-bundle\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.833881 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-credential-keys\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.839701 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d56r2\" (UniqueName: \"kubernetes.io/projected/04bd2ecd-c468-4efa-877b-983c27dde353-kube-api-access-d56r2\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.844388 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-scripts\") pod \"keystone-bootstrap-7btxb\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.861688 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7db01ca6-f0c9-4f53-888a-dc2c806310cd" 
path="/var/lib/kubelet/pods/7db01ca6-f0c9-4f53-888a-dc2c806310cd/volumes" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.862451 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff73c79b-7168-4596-b74c-136ff3bfff2f" path="/var/lib/kubelet/pods/ff73c79b-7168-4596-b74c-136ff3bfff2f/volumes" Oct 05 20:31:13 crc kubenswrapper[4753]: I1005 20:31:13.951613 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.418530 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7btxb"] Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.759661 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-b97mg"] Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.760747 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-b97mg" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.763612 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.766429 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f896l" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.776092 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-b97mg"] Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.836762 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fa63f817-249e-416d-aa21-47fe6e04180c-db-sync-config-data\") pod \"barbican-db-sync-b97mg\" (UID: \"fa63f817-249e-416d-aa21-47fe6e04180c\") " pod="openstack/barbican-db-sync-b97mg" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.836859 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j74jc\" (UniqueName: \"kubernetes.io/projected/fa63f817-249e-416d-aa21-47fe6e04180c-kube-api-access-j74jc\") pod \"barbican-db-sync-b97mg\" (UID: \"fa63f817-249e-416d-aa21-47fe6e04180c\") " pod="openstack/barbican-db-sync-b97mg" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.836911 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa63f817-249e-416d-aa21-47fe6e04180c-combined-ca-bundle\") pod \"barbican-db-sync-b97mg\" (UID: \"fa63f817-249e-416d-aa21-47fe6e04180c\") " pod="openstack/barbican-db-sync-b97mg" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.874455 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-w27gb"] Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.875833 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-w27gb" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.878682 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.880278 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-w27gb"] Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.881980 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vwlnw" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.882472 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.937705 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fa63f817-249e-416d-aa21-47fe6e04180c-db-sync-config-data\") pod \"barbican-db-sync-b97mg\" 
(UID: \"fa63f817-249e-416d-aa21-47fe6e04180c\") " pod="openstack/barbican-db-sync-b97mg" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.937790 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j74jc\" (UniqueName: \"kubernetes.io/projected/fa63f817-249e-416d-aa21-47fe6e04180c-kube-api-access-j74jc\") pod \"barbican-db-sync-b97mg\" (UID: \"fa63f817-249e-416d-aa21-47fe6e04180c\") " pod="openstack/barbican-db-sync-b97mg" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.937835 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa63f817-249e-416d-aa21-47fe6e04180c-combined-ca-bundle\") pod \"barbican-db-sync-b97mg\" (UID: \"fa63f817-249e-416d-aa21-47fe6e04180c\") " pod="openstack/barbican-db-sync-b97mg" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.942637 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa63f817-249e-416d-aa21-47fe6e04180c-combined-ca-bundle\") pod \"barbican-db-sync-b97mg\" (UID: \"fa63f817-249e-416d-aa21-47fe6e04180c\") " pod="openstack/barbican-db-sync-b97mg" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.950950 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fa63f817-249e-416d-aa21-47fe6e04180c-db-sync-config-data\") pod \"barbican-db-sync-b97mg\" (UID: \"fa63f817-249e-416d-aa21-47fe6e04180c\") " pod="openstack/barbican-db-sync-b97mg" Oct 05 20:31:14 crc kubenswrapper[4753]: I1005 20:31:14.958099 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j74jc\" (UniqueName: \"kubernetes.io/projected/fa63f817-249e-416d-aa21-47fe6e04180c-kube-api-access-j74jc\") pod \"barbican-db-sync-b97mg\" (UID: \"fa63f817-249e-416d-aa21-47fe6e04180c\") " pod="openstack/barbican-db-sync-b97mg" Oct 
05 20:31:15 crc kubenswrapper[4753]: I1005 20:31:15.013923 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7btxb" event={"ID":"04bd2ecd-c468-4efa-877b-983c27dde353","Type":"ContainerStarted","Data":"a57f5bec87864d6cb8f5081178714ac05a4d7a55ab7f11e72278a751202c2bce"} Oct 05 20:31:15 crc kubenswrapper[4753]: I1005 20:31:15.039644 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c440ae46-1143-44a6-8443-1b4c27fda1d1-config\") pod \"neutron-db-sync-w27gb\" (UID: \"c440ae46-1143-44a6-8443-1b4c27fda1d1\") " pod="openstack/neutron-db-sync-w27gb" Oct 05 20:31:15 crc kubenswrapper[4753]: I1005 20:31:15.039741 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf2xr\" (UniqueName: \"kubernetes.io/projected/c440ae46-1143-44a6-8443-1b4c27fda1d1-kube-api-access-bf2xr\") pod \"neutron-db-sync-w27gb\" (UID: \"c440ae46-1143-44a6-8443-1b4c27fda1d1\") " pod="openstack/neutron-db-sync-w27gb" Oct 05 20:31:15 crc kubenswrapper[4753]: I1005 20:31:15.039910 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c440ae46-1143-44a6-8443-1b4c27fda1d1-combined-ca-bundle\") pod \"neutron-db-sync-w27gb\" (UID: \"c440ae46-1143-44a6-8443-1b4c27fda1d1\") " pod="openstack/neutron-db-sync-w27gb" Oct 05 20:31:15 crc kubenswrapper[4753]: I1005 20:31:15.094514 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-b97mg" Oct 05 20:31:15 crc kubenswrapper[4753]: I1005 20:31:15.141306 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c440ae46-1143-44a6-8443-1b4c27fda1d1-combined-ca-bundle\") pod \"neutron-db-sync-w27gb\" (UID: \"c440ae46-1143-44a6-8443-1b4c27fda1d1\") " pod="openstack/neutron-db-sync-w27gb" Oct 05 20:31:15 crc kubenswrapper[4753]: I1005 20:31:15.141419 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c440ae46-1143-44a6-8443-1b4c27fda1d1-config\") pod \"neutron-db-sync-w27gb\" (UID: \"c440ae46-1143-44a6-8443-1b4c27fda1d1\") " pod="openstack/neutron-db-sync-w27gb" Oct 05 20:31:15 crc kubenswrapper[4753]: I1005 20:31:15.141455 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf2xr\" (UniqueName: \"kubernetes.io/projected/c440ae46-1143-44a6-8443-1b4c27fda1d1-kube-api-access-bf2xr\") pod \"neutron-db-sync-w27gb\" (UID: \"c440ae46-1143-44a6-8443-1b4c27fda1d1\") " pod="openstack/neutron-db-sync-w27gb" Oct 05 20:31:15 crc kubenswrapper[4753]: I1005 20:31:15.149131 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c440ae46-1143-44a6-8443-1b4c27fda1d1-config\") pod \"neutron-db-sync-w27gb\" (UID: \"c440ae46-1143-44a6-8443-1b4c27fda1d1\") " pod="openstack/neutron-db-sync-w27gb" Oct 05 20:31:15 crc kubenswrapper[4753]: I1005 20:31:15.149444 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c440ae46-1143-44a6-8443-1b4c27fda1d1-combined-ca-bundle\") pod \"neutron-db-sync-w27gb\" (UID: \"c440ae46-1143-44a6-8443-1b4c27fda1d1\") " pod="openstack/neutron-db-sync-w27gb" Oct 05 20:31:15 crc kubenswrapper[4753]: I1005 20:31:15.159440 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf2xr\" (UniqueName: \"kubernetes.io/projected/c440ae46-1143-44a6-8443-1b4c27fda1d1-kube-api-access-bf2xr\") pod \"neutron-db-sync-w27gb\" (UID: \"c440ae46-1143-44a6-8443-1b4c27fda1d1\") " pod="openstack/neutron-db-sync-w27gb" Oct 05 20:31:15 crc kubenswrapper[4753]: I1005 20:31:15.237351 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-w27gb" Oct 05 20:31:15 crc kubenswrapper[4753]: I1005 20:31:15.548408 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-b97mg"] Oct 05 20:31:15 crc kubenswrapper[4753]: W1005 20:31:15.558282 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa63f817_249e_416d_aa21_47fe6e04180c.slice/crio-e0f161dcbf71def7995528f83abecfc245f326f7292afda86e9647772f4ffeab WatchSource:0}: Error finding container e0f161dcbf71def7995528f83abecfc245f326f7292afda86e9647772f4ffeab: Status 404 returned error can't find the container with id e0f161dcbf71def7995528f83abecfc245f326f7292afda86e9647772f4ffeab Oct 05 20:31:15 crc kubenswrapper[4753]: I1005 20:31:15.719592 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-w27gb"] Oct 05 20:31:15 crc kubenswrapper[4753]: W1005 20:31:15.724778 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc440ae46_1143_44a6_8443_1b4c27fda1d1.slice/crio-132dde30d8726696f05aa9028d4dd4c5eba5af3e5a071b84098ecfce815d5037 WatchSource:0}: Error finding container 132dde30d8726696f05aa9028d4dd4c5eba5af3e5a071b84098ecfce815d5037: Status 404 returned error can't find the container with id 132dde30d8726696f05aa9028d4dd4c5eba5af3e5a071b84098ecfce815d5037 Oct 05 20:31:16 crc kubenswrapper[4753]: I1005 20:31:16.025727 4753 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/neutron-db-sync-w27gb" event={"ID":"c440ae46-1143-44a6-8443-1b4c27fda1d1","Type":"ContainerStarted","Data":"43014cc60553b045db9e915b5f2ec0958cf6323fcd52f2246bbd6f261ebe0483"} Oct 05 20:31:16 crc kubenswrapper[4753]: I1005 20:31:16.025767 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w27gb" event={"ID":"c440ae46-1143-44a6-8443-1b4c27fda1d1","Type":"ContainerStarted","Data":"132dde30d8726696f05aa9028d4dd4c5eba5af3e5a071b84098ecfce815d5037"} Oct 05 20:31:16 crc kubenswrapper[4753]: I1005 20:31:16.029447 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b97mg" event={"ID":"fa63f817-249e-416d-aa21-47fe6e04180c","Type":"ContainerStarted","Data":"e0f161dcbf71def7995528f83abecfc245f326f7292afda86e9647772f4ffeab"} Oct 05 20:31:16 crc kubenswrapper[4753]: I1005 20:31:16.031124 4753 generic.go:334] "Generic (PLEG): container finished" podID="3154b1ca-ea81-4fc8-ba7d-ff439e97c930" containerID="1f65b40afdeba568afe9de88bd091a6b6b710eede63ee93a09fb76c30c3a1b29" exitCode=0 Oct 05 20:31:16 crc kubenswrapper[4753]: I1005 20:31:16.031194 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-flzmg" event={"ID":"3154b1ca-ea81-4fc8-ba7d-ff439e97c930","Type":"ContainerDied","Data":"1f65b40afdeba568afe9de88bd091a6b6b710eede63ee93a09fb76c30c3a1b29"} Oct 05 20:31:16 crc kubenswrapper[4753]: I1005 20:31:16.033581 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de197a05-86cb-4de6-b4f7-03e27dba1e02","Type":"ContainerStarted","Data":"db0ba4526062640baf7519ca063ea638d3449834d30bf58803214c5acf5dc485"} Oct 05 20:31:16 crc kubenswrapper[4753]: I1005 20:31:16.036915 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7btxb" event={"ID":"04bd2ecd-c468-4efa-877b-983c27dde353","Type":"ContainerStarted","Data":"3bd2a438f86333ebfd71c5c63d94276e3cb4fe2084ff7a5c42beaabddbd3988e"} Oct 05 
20:31:16 crc kubenswrapper[4753]: I1005 20:31:16.045269 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-w27gb" podStartSLOduration=2.045252903 podStartE2EDuration="2.045252903s" podCreationTimestamp="2025-10-05 20:31:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:31:16.043095325 +0000 UTC m=+984.891423557" watchObservedRunningTime="2025-10-05 20:31:16.045252903 +0000 UTC m=+984.893581135" Oct 05 20:31:16 crc kubenswrapper[4753]: I1005 20:31:16.061072 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7btxb" podStartSLOduration=3.061055287 podStartE2EDuration="3.061055287s" podCreationTimestamp="2025-10-05 20:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:31:16.058076644 +0000 UTC m=+984.906404876" watchObservedRunningTime="2025-10-05 20:31:16.061055287 +0000 UTC m=+984.909383519" Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.467066 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.584626 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-combined-ca-bundle\") pod \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.584760 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-logs\") pod \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.584789 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-scripts\") pod \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.584826 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-config-data\") pod \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.584844 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr5bq\" (UniqueName: \"kubernetes.io/projected/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-kube-api-access-dr5bq\") pod \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\" (UID: \"3154b1ca-ea81-4fc8-ba7d-ff439e97c930\") " Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.585086 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-logs" (OuterVolumeSpecName: "logs") pod "3154b1ca-ea81-4fc8-ba7d-ff439e97c930" (UID: "3154b1ca-ea81-4fc8-ba7d-ff439e97c930"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.590949 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-scripts" (OuterVolumeSpecName: "scripts") pod "3154b1ca-ea81-4fc8-ba7d-ff439e97c930" (UID: "3154b1ca-ea81-4fc8-ba7d-ff439e97c930"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.596616 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-kube-api-access-dr5bq" (OuterVolumeSpecName: "kube-api-access-dr5bq") pod "3154b1ca-ea81-4fc8-ba7d-ff439e97c930" (UID: "3154b1ca-ea81-4fc8-ba7d-ff439e97c930"). InnerVolumeSpecName "kube-api-access-dr5bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.613565 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-config-data" (OuterVolumeSpecName: "config-data") pod "3154b1ca-ea81-4fc8-ba7d-ff439e97c930" (UID: "3154b1ca-ea81-4fc8-ba7d-ff439e97c930"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.647958 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3154b1ca-ea81-4fc8-ba7d-ff439e97c930" (UID: "3154b1ca-ea81-4fc8-ba7d-ff439e97c930"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.691698 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.691735 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-logs\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.691744 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.691752 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:17 crc kubenswrapper[4753]: I1005 20:31:17.691760 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr5bq\" (UniqueName: \"kubernetes.io/projected/3154b1ca-ea81-4fc8-ba7d-ff439e97c930-kube-api-access-dr5bq\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.074059 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-flzmg" event={"ID":"3154b1ca-ea81-4fc8-ba7d-ff439e97c930","Type":"ContainerDied","Data":"7b00f5a821bfc58f9a4cbe137fbbd1b1c6aa8370b0bc24584658bd4a34460ba8"} Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.074364 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b00f5a821bfc58f9a4cbe137fbbd1b1c6aa8370b0bc24584658bd4a34460ba8" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.074419 4753 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-sync-flzmg" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.188413 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-78454fb4-ktvqp"] Oct 05 20:31:18 crc kubenswrapper[4753]: E1005 20:31:18.188801 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3154b1ca-ea81-4fc8-ba7d-ff439e97c930" containerName="placement-db-sync" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.188819 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3154b1ca-ea81-4fc8-ba7d-ff439e97c930" containerName="placement-db-sync" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.189011 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3154b1ca-ea81-4fc8-ba7d-ff439e97c930" containerName="placement-db-sync" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.189842 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.196959 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78454fb4-ktvqp"] Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.201422 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.201610 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.202302 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9m55q" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.202349 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.207328 4753 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"placement-scripts" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.303964 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/131ce515-ac42-4446-b075-5e50254e6686-logs\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.304020 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/131ce515-ac42-4446-b075-5e50254e6686-internal-tls-certs\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.304121 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn6tx\" (UniqueName: \"kubernetes.io/projected/131ce515-ac42-4446-b075-5e50254e6686-kube-api-access-rn6tx\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.304266 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/131ce515-ac42-4446-b075-5e50254e6686-public-tls-certs\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.304293 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ce515-ac42-4446-b075-5e50254e6686-combined-ca-bundle\") pod \"placement-78454fb4-ktvqp\" (UID: 
\"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.304317 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/131ce515-ac42-4446-b075-5e50254e6686-scripts\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.304332 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131ce515-ac42-4446-b075-5e50254e6686-config-data\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.405931 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/131ce515-ac42-4446-b075-5e50254e6686-public-tls-certs\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.405989 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ce515-ac42-4446-b075-5e50254e6686-combined-ca-bundle\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.406032 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/131ce515-ac42-4446-b075-5e50254e6686-scripts\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 
20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.406053 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131ce515-ac42-4446-b075-5e50254e6686-config-data\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.406119 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/131ce515-ac42-4446-b075-5e50254e6686-logs\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.406163 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/131ce515-ac42-4446-b075-5e50254e6686-internal-tls-certs\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.406196 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn6tx\" (UniqueName: \"kubernetes.io/projected/131ce515-ac42-4446-b075-5e50254e6686-kube-api-access-rn6tx\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.406795 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/131ce515-ac42-4446-b075-5e50254e6686-logs\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.424170 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/131ce515-ac42-4446-b075-5e50254e6686-scripts\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.425257 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131ce515-ac42-4446-b075-5e50254e6686-config-data\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.425956 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/131ce515-ac42-4446-b075-5e50254e6686-public-tls-certs\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.426432 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn6tx\" (UniqueName: \"kubernetes.io/projected/131ce515-ac42-4446-b075-5e50254e6686-kube-api-access-rn6tx\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.427764 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/131ce515-ac42-4446-b075-5e50254e6686-internal-tls-certs\") pod \"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.429014 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ce515-ac42-4446-b075-5e50254e6686-combined-ca-bundle\") pod 
\"placement-78454fb4-ktvqp\" (UID: \"131ce515-ac42-4446-b075-5e50254e6686\") " pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:18 crc kubenswrapper[4753]: I1005 20:31:18.521174 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:19 crc kubenswrapper[4753]: I1005 20:31:19.084383 4753 generic.go:334] "Generic (PLEG): container finished" podID="04bd2ecd-c468-4efa-877b-983c27dde353" containerID="3bd2a438f86333ebfd71c5c63d94276e3cb4fe2084ff7a5c42beaabddbd3988e" exitCode=0 Oct 05 20:31:19 crc kubenswrapper[4753]: I1005 20:31:19.084422 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7btxb" event={"ID":"04bd2ecd-c468-4efa-877b-983c27dde353","Type":"ContainerDied","Data":"3bd2a438f86333ebfd71c5c63d94276e3cb4fe2084ff7a5c42beaabddbd3988e"} Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.512429 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.677532 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-fernet-keys\") pod \"04bd2ecd-c468-4efa-877b-983c27dde353\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.677598 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-scripts\") pod \"04bd2ecd-c468-4efa-877b-983c27dde353\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.677696 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-credential-keys\") pod 
\"04bd2ecd-c468-4efa-877b-983c27dde353\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.677756 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-combined-ca-bundle\") pod \"04bd2ecd-c468-4efa-877b-983c27dde353\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.677848 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d56r2\" (UniqueName: \"kubernetes.io/projected/04bd2ecd-c468-4efa-877b-983c27dde353-kube-api-access-d56r2\") pod \"04bd2ecd-c468-4efa-877b-983c27dde353\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.677874 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-config-data\") pod \"04bd2ecd-c468-4efa-877b-983c27dde353\" (UID: \"04bd2ecd-c468-4efa-877b-983c27dde353\") " Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.690328 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "04bd2ecd-c468-4efa-877b-983c27dde353" (UID: "04bd2ecd-c468-4efa-877b-983c27dde353"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.690382 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04bd2ecd-c468-4efa-877b-983c27dde353-kube-api-access-d56r2" (OuterVolumeSpecName: "kube-api-access-d56r2") pod "04bd2ecd-c468-4efa-877b-983c27dde353" (UID: "04bd2ecd-c468-4efa-877b-983c27dde353"). InnerVolumeSpecName "kube-api-access-d56r2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.702823 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-scripts" (OuterVolumeSpecName: "scripts") pod "04bd2ecd-c468-4efa-877b-983c27dde353" (UID: "04bd2ecd-c468-4efa-877b-983c27dde353"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.708228 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04bd2ecd-c468-4efa-877b-983c27dde353" (UID: "04bd2ecd-c468-4efa-877b-983c27dde353"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.709093 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "04bd2ecd-c468-4efa-877b-983c27dde353" (UID: "04bd2ecd-c468-4efa-877b-983c27dde353"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.721419 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-config-data" (OuterVolumeSpecName: "config-data") pod "04bd2ecd-c468-4efa-877b-983c27dde353" (UID: "04bd2ecd-c468-4efa-877b-983c27dde353"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.779309 4753 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.779342 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.779351 4753 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.779361 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.779370 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d56r2\" (UniqueName: \"kubernetes.io/projected/04bd2ecd-c468-4efa-877b-983c27dde353-kube-api-access-d56r2\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:22 crc kubenswrapper[4753]: I1005 20:31:22.779379 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bd2ecd-c468-4efa-877b-983c27dde353-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.157607 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7btxb" event={"ID":"04bd2ecd-c468-4efa-877b-983c27dde353","Type":"ContainerDied","Data":"a57f5bec87864d6cb8f5081178714ac05a4d7a55ab7f11e72278a751202c2bce"} Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 
20:31:23.157698 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a57f5bec87864d6cb8f5081178714ac05a4d7a55ab7f11e72278a751202c2bce" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.157750 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7btxb" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.706481 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-85d68b7848-bh92h"] Oct 05 20:31:23 crc kubenswrapper[4753]: E1005 20:31:23.707996 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04bd2ecd-c468-4efa-877b-983c27dde353" containerName="keystone-bootstrap" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.708100 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="04bd2ecd-c468-4efa-877b-983c27dde353" containerName="keystone-bootstrap" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.708412 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="04bd2ecd-c468-4efa-877b-983c27dde353" containerName="keystone-bootstrap" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.709313 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.717720 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.718208 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.718420 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.718695 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2mmgx" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.719003 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.719402 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.770530 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85d68b7848-bh92h"] Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.797545 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-internal-tls-certs\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.797938 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-credential-keys\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " 
pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.798096 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-scripts\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.798248 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-fernet-keys\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.798369 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-combined-ca-bundle\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.798474 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-public-tls-certs\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.798573 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p8v9\" (UniqueName: \"kubernetes.io/projected/bdbb6c59-3c98-4b88-a1aa-7304476a522a-kube-api-access-7p8v9\") pod \"keystone-85d68b7848-bh92h\" (UID: 
\"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.798942 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-config-data\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.900175 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-config-data\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.901022 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-internal-tls-certs\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.901103 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-credential-keys\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.901277 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-scripts\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 
20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.901340 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-fernet-keys\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.901384 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-combined-ca-bundle\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.901410 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-public-tls-certs\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.901438 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p8v9\" (UniqueName: \"kubernetes.io/projected/bdbb6c59-3c98-4b88-a1aa-7304476a522a-kube-api-access-7p8v9\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.905481 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-config-data\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.906550 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-credential-keys\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.906606 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-public-tls-certs\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.906613 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-combined-ca-bundle\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.907118 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-internal-tls-certs\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.907382 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-scripts\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.908693 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/bdbb6c59-3c98-4b88-a1aa-7304476a522a-fernet-keys\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:23 crc kubenswrapper[4753]: I1005 20:31:23.921939 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p8v9\" (UniqueName: \"kubernetes.io/projected/bdbb6c59-3c98-4b88-a1aa-7304476a522a-kube-api-access-7p8v9\") pod \"keystone-85d68b7848-bh92h\" (UID: \"bdbb6c59-3c98-4b88-a1aa-7304476a522a\") " pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:24 crc kubenswrapper[4753]: I1005 20:31:24.028869 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:32 crc kubenswrapper[4753]: E1005 20:31:32.855421 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b0c0824763cbfd23b836ee4355015d9f94daa115bcc9ef0ea8b8e8980d5a6213" Oct 05 20:31:32 crc kubenswrapper[4753]: E1005 20:31:32.856063 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b0c0824763cbfd23b836ee4355015d9f94daa115bcc9ef0ea8b8e8980d5a6213,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8mj2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-tnf7s_openstack(e3841483-2af9-40d8-8197-531d8dd1e57f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:31:32 crc kubenswrapper[4753]: E1005 20:31:32.857358 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-tnf7s" podUID="e3841483-2af9-40d8-8197-531d8dd1e57f" Oct 05 20:31:33 crc kubenswrapper[4753]: I1005 20:31:33.239419 4753 generic.go:334] "Generic (PLEG): container finished" podID="c440ae46-1143-44a6-8443-1b4c27fda1d1" containerID="43014cc60553b045db9e915b5f2ec0958cf6323fcd52f2246bbd6f261ebe0483" exitCode=0 Oct 05 20:31:33 crc kubenswrapper[4753]: I1005 20:31:33.239499 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w27gb" event={"ID":"c440ae46-1143-44a6-8443-1b4c27fda1d1","Type":"ContainerDied","Data":"43014cc60553b045db9e915b5f2ec0958cf6323fcd52f2246bbd6f261ebe0483"} Oct 05 20:31:33 crc kubenswrapper[4753]: E1005 20:31:33.243281 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b0c0824763cbfd23b836ee4355015d9f94daa115bcc9ef0ea8b8e8980d5a6213\\\"\"" pod="openstack/cinder-db-sync-tnf7s" podUID="e3841483-2af9-40d8-8197-531d8dd1e57f" Oct 05 20:31:34 crc kubenswrapper[4753]: I1005 20:31:34.489821 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:31:34 crc kubenswrapper[4753]: I1005 20:31:34.490298 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:31:34 crc kubenswrapper[4753]: I1005 20:31:34.490357 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:31:34 crc kubenswrapper[4753]: I1005 20:31:34.491055 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4bd1799dc562bea716f28381e5b80e668ae3afe82aa4afc891b52d8b6b3b6337"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 20:31:34 crc kubenswrapper[4753]: I1005 20:31:34.491113 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" containerID="cri-o://4bd1799dc562bea716f28381e5b80e668ae3afe82aa4afc891b52d8b6b3b6337" gracePeriod=600 Oct 05 20:31:35 crc kubenswrapper[4753]: I1005 20:31:35.259302 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="4bd1799dc562bea716f28381e5b80e668ae3afe82aa4afc891b52d8b6b3b6337" exitCode=0 Oct 05 20:31:35 crc kubenswrapper[4753]: I1005 20:31:35.259348 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" 
event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"4bd1799dc562bea716f28381e5b80e668ae3afe82aa4afc891b52d8b6b3b6337"} Oct 05 20:31:35 crc kubenswrapper[4753]: I1005 20:31:35.259706 4753 scope.go:117] "RemoveContainer" containerID="52a63543bb254f48756b18656816abfb4df8b41bf216da0e9c35cd4d17058bd4" Oct 05 20:31:35 crc kubenswrapper[4753]: E1005 20:31:35.552081 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:7db347424a8c5998059c5bf84c86a1ef8d582d1ffe39f4887551f2ac85a4915f" Oct 05 20:31:35 crc kubenswrapper[4753]: E1005 20:31:35.552289 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:7db347424a8c5998059c5bf84c86a1ef8d582d1ffe39f4887551f2ac85a4915f,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j74jc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContex
t{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-b97mg_openstack(fa63f817-249e-416d-aa21-47fe6e04180c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:31:35 crc kubenswrapper[4753]: E1005 20:31:35.554895 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-b97mg" podUID="fa63f817-249e-416d-aa21-47fe6e04180c" Oct 05 20:31:35 crc kubenswrapper[4753]: I1005 20:31:35.722387 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-w27gb" Oct 05 20:31:35 crc kubenswrapper[4753]: I1005 20:31:35.762627 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c440ae46-1143-44a6-8443-1b4c27fda1d1-combined-ca-bundle\") pod \"c440ae46-1143-44a6-8443-1b4c27fda1d1\" (UID: \"c440ae46-1143-44a6-8443-1b4c27fda1d1\") " Oct 05 20:31:35 crc kubenswrapper[4753]: I1005 20:31:35.762738 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c440ae46-1143-44a6-8443-1b4c27fda1d1-config\") pod \"c440ae46-1143-44a6-8443-1b4c27fda1d1\" (UID: \"c440ae46-1143-44a6-8443-1b4c27fda1d1\") " Oct 05 20:31:35 crc kubenswrapper[4753]: I1005 20:31:35.762814 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2xr\" (UniqueName: \"kubernetes.io/projected/c440ae46-1143-44a6-8443-1b4c27fda1d1-kube-api-access-bf2xr\") pod \"c440ae46-1143-44a6-8443-1b4c27fda1d1\" (UID: \"c440ae46-1143-44a6-8443-1b4c27fda1d1\") " Oct 05 20:31:35 crc kubenswrapper[4753]: I1005 20:31:35.773377 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c440ae46-1143-44a6-8443-1b4c27fda1d1-kube-api-access-bf2xr" (OuterVolumeSpecName: "kube-api-access-bf2xr") pod "c440ae46-1143-44a6-8443-1b4c27fda1d1" (UID: "c440ae46-1143-44a6-8443-1b4c27fda1d1"). InnerVolumeSpecName "kube-api-access-bf2xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:35 crc kubenswrapper[4753]: I1005 20:31:35.803047 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c440ae46-1143-44a6-8443-1b4c27fda1d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c440ae46-1143-44a6-8443-1b4c27fda1d1" (UID: "c440ae46-1143-44a6-8443-1b4c27fda1d1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:35 crc kubenswrapper[4753]: I1005 20:31:35.818798 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c440ae46-1143-44a6-8443-1b4c27fda1d1-config" (OuterVolumeSpecName: "config") pod "c440ae46-1143-44a6-8443-1b4c27fda1d1" (UID: "c440ae46-1143-44a6-8443-1b4c27fda1d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:35 crc kubenswrapper[4753]: I1005 20:31:35.864795 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c440ae46-1143-44a6-8443-1b4c27fda1d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:35 crc kubenswrapper[4753]: I1005 20:31:35.864823 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c440ae46-1143-44a6-8443-1b4c27fda1d1-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:35 crc kubenswrapper[4753]: I1005 20:31:35.864832 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2xr\" (UniqueName: \"kubernetes.io/projected/c440ae46-1143-44a6-8443-1b4c27fda1d1-kube-api-access-bf2xr\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:36 crc kubenswrapper[4753]: I1005 20:31:36.083380 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85d68b7848-bh92h"] Oct 05 20:31:36 crc kubenswrapper[4753]: W1005 20:31:36.095345 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdbb6c59_3c98_4b88_a1aa_7304476a522a.slice/crio-41c607d684e7d0570e4416824a3aacf2dd70c3519e07a5779695f4ecdd4b2bed WatchSource:0}: Error finding container 41c607d684e7d0570e4416824a3aacf2dd70c3519e07a5779695f4ecdd4b2bed: Status 404 returned error can't find the container with id 41c607d684e7d0570e4416824a3aacf2dd70c3519e07a5779695f4ecdd4b2bed Oct 05 20:31:36 crc 
kubenswrapper[4753]: I1005 20:31:36.096202 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78454fb4-ktvqp"] Oct 05 20:31:36 crc kubenswrapper[4753]: I1005 20:31:36.275427 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de197a05-86cb-4de6-b4f7-03e27dba1e02","Type":"ContainerStarted","Data":"52c1174d5cf59852f61e66a223d8e2b42f94b261f1b765a89b38dea23d972ee2"} Oct 05 20:31:36 crc kubenswrapper[4753]: I1005 20:31:36.278392 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w27gb" event={"ID":"c440ae46-1143-44a6-8443-1b4c27fda1d1","Type":"ContainerDied","Data":"132dde30d8726696f05aa9028d4dd4c5eba5af3e5a071b84098ecfce815d5037"} Oct 05 20:31:36 crc kubenswrapper[4753]: I1005 20:31:36.278462 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="132dde30d8726696f05aa9028d4dd4c5eba5af3e5a071b84098ecfce815d5037" Oct 05 20:31:36 crc kubenswrapper[4753]: I1005 20:31:36.277607 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-w27gb" Oct 05 20:31:36 crc kubenswrapper[4753]: I1005 20:31:36.280037 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78454fb4-ktvqp" event={"ID":"131ce515-ac42-4446-b075-5e50254e6686","Type":"ContainerStarted","Data":"f9f11c30ebafe92b9a8eb6490fc0fafe2e25410aae1d8b5bd1c0ceb18b060fba"} Oct 05 20:31:36 crc kubenswrapper[4753]: I1005 20:31:36.283434 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85d68b7848-bh92h" event={"ID":"bdbb6c59-3c98-4b88-a1aa-7304476a522a","Type":"ContainerStarted","Data":"41c607d684e7d0570e4416824a3aacf2dd70c3519e07a5779695f4ecdd4b2bed"} Oct 05 20:31:36 crc kubenswrapper[4753]: I1005 20:31:36.297995 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"6035d1f392035c4d85f80611de9c6434a6132b34d7431e03335a2036e9d1df2f"} Oct 05 20:31:36 crc kubenswrapper[4753]: E1005 20:31:36.299720 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:7db347424a8c5998059c5bf84c86a1ef8d582d1ffe39f4887551f2ac85a4915f\\\"\"" pod="openstack/barbican-db-sync-b97mg" podUID="fa63f817-249e-416d-aa21-47fe6e04180c" Oct 05 20:31:36 crc kubenswrapper[4753]: I1005 20:31:36.972419 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f69d7557-flmx9"] Oct 05 20:31:36 crc kubenswrapper[4753]: E1005 20:31:36.973620 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c440ae46-1143-44a6-8443-1b4c27fda1d1" containerName="neutron-db-sync" Oct 05 20:31:36 crc kubenswrapper[4753]: I1005 20:31:36.973641 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c440ae46-1143-44a6-8443-1b4c27fda1d1" 
containerName="neutron-db-sync" Oct 05 20:31:36 crc kubenswrapper[4753]: I1005 20:31:36.973816 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c440ae46-1143-44a6-8443-1b4c27fda1d1" containerName="neutron-db-sync" Oct 05 20:31:36 crc kubenswrapper[4753]: I1005 20:31:36.974790 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:36 crc kubenswrapper[4753]: I1005 20:31:36.986874 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f69d7557-flmx9"] Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.087760 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-ovsdbserver-sb\") pod \"dnsmasq-dns-6f69d7557-flmx9\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") " pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.087829 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-ovsdbserver-nb\") pod \"dnsmasq-dns-6f69d7557-flmx9\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") " pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.087887 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2hk5\" (UniqueName: \"kubernetes.io/projected/1cc6a54b-8439-4745-8b04-98f4d28d06b4-kube-api-access-b2hk5\") pod \"dnsmasq-dns-6f69d7557-flmx9\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") " pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.087921 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-dns-svc\") pod \"dnsmasq-dns-6f69d7557-flmx9\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") " pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.087957 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-config\") pod \"dnsmasq-dns-6f69d7557-flmx9\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") " pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.127588 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-866d77d954-p4qfh"] Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.128781 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.132839 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vwlnw" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.133086 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.133362 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.133512 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.156782 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-866d77d954-p4qfh"] Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.191007 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2hk5\" (UniqueName: 
\"kubernetes.io/projected/1cc6a54b-8439-4745-8b04-98f4d28d06b4-kube-api-access-b2hk5\") pod \"dnsmasq-dns-6f69d7557-flmx9\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") " pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.191367 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-dns-svc\") pod \"dnsmasq-dns-6f69d7557-flmx9\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") " pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.191422 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-config\") pod \"dnsmasq-dns-6f69d7557-flmx9\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") " pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.191460 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-ovsdbserver-sb\") pod \"dnsmasq-dns-6f69d7557-flmx9\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") " pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.191507 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-ovsdbserver-nb\") pod \"dnsmasq-dns-6f69d7557-flmx9\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") " pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.192361 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6f69d7557-flmx9\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") " pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.192441 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-config\") pod \"dnsmasq-dns-6f69d7557-flmx9\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") " pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.192852 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-ovsdbserver-sb\") pod \"dnsmasq-dns-6f69d7557-flmx9\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") " pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.193363 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-dns-svc\") pod \"dnsmasq-dns-6f69d7557-flmx9\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") " pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.211394 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2hk5\" (UniqueName: \"kubernetes.io/projected/1cc6a54b-8439-4745-8b04-98f4d28d06b4-kube-api-access-b2hk5\") pod \"dnsmasq-dns-6f69d7557-flmx9\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") " pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.292559 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mrtl\" (UniqueName: \"kubernetes.io/projected/c507e884-7868-46c1-b89a-e8ee71f3e8e1-kube-api-access-2mrtl\") pod \"neutron-866d77d954-p4qfh\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " 
pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.293314 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-combined-ca-bundle\") pod \"neutron-866d77d954-p4qfh\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.293344 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-ovndb-tls-certs\") pod \"neutron-866d77d954-p4qfh\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.293380 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-httpd-config\") pod \"neutron-866d77d954-p4qfh\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.293423 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-config\") pod \"neutron-866d77d954-p4qfh\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.298301 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.309761 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78454fb4-ktvqp" event={"ID":"131ce515-ac42-4446-b075-5e50254e6686","Type":"ContainerStarted","Data":"8e80cec114b05379d3e7e682424bc913e471619e4bddfd2dcd170e1fc74fc804"} Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.309798 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78454fb4-ktvqp" event={"ID":"131ce515-ac42-4446-b075-5e50254e6686","Type":"ContainerStarted","Data":"3677aae62c826e616da1259c7b2b1376c13fa66508bbee22ad7d40be1a730b34"} Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.310630 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.310654 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.313450 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85d68b7848-bh92h" event={"ID":"bdbb6c59-3c98-4b88-a1aa-7304476a522a","Type":"ContainerStarted","Data":"416b1c70a68d42b03d2eb570e0bdcea0414e80b9c621d91d5ef4548e3e148103"} Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.313477 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.335732 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-78454fb4-ktvqp" podStartSLOduration=19.33571624 podStartE2EDuration="19.33571624s" podCreationTimestamp="2025-10-05 20:31:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:31:37.332786339 +0000 UTC 
m=+1006.181114571" watchObservedRunningTime="2025-10-05 20:31:37.33571624 +0000 UTC m=+1006.184044472" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.395291 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-combined-ca-bundle\") pod \"neutron-866d77d954-p4qfh\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.395332 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-ovndb-tls-certs\") pod \"neutron-866d77d954-p4qfh\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.395370 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-httpd-config\") pod \"neutron-866d77d954-p4qfh\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.395409 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-config\") pod \"neutron-866d77d954-p4qfh\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.395455 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mrtl\" (UniqueName: \"kubernetes.io/projected/c507e884-7868-46c1-b89a-e8ee71f3e8e1-kube-api-access-2mrtl\") pod \"neutron-866d77d954-p4qfh\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " pod="openstack/neutron-866d77d954-p4qfh" 
Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.407070 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-config\") pod \"neutron-866d77d954-p4qfh\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.407960 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-combined-ca-bundle\") pod \"neutron-866d77d954-p4qfh\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.416055 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-ovndb-tls-certs\") pod \"neutron-866d77d954-p4qfh\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.416863 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mrtl\" (UniqueName: \"kubernetes.io/projected/c507e884-7868-46c1-b89a-e8ee71f3e8e1-kube-api-access-2mrtl\") pod \"neutron-866d77d954-p4qfh\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.418080 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-httpd-config\") pod \"neutron-866d77d954-p4qfh\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.450775 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.910365 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-85d68b7848-bh92h" podStartSLOduration=14.910346448 podStartE2EDuration="14.910346448s" podCreationTimestamp="2025-10-05 20:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:31:37.359664359 +0000 UTC m=+1006.207992591" watchObservedRunningTime="2025-10-05 20:31:37.910346448 +0000 UTC m=+1006.758674680" Oct 05 20:31:37 crc kubenswrapper[4753]: I1005 20:31:37.911557 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f69d7557-flmx9"] Oct 05 20:31:37 crc kubenswrapper[4753]: W1005 20:31:37.920310 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cc6a54b_8439_4745_8b04_98f4d28d06b4.slice/crio-fc26f612b471ffd2d7a301679979ccb83ce66d36891bca87dd5e9e0e8a8546db WatchSource:0}: Error finding container fc26f612b471ffd2d7a301679979ccb83ce66d36891bca87dd5e9e0e8a8546db: Status 404 returned error can't find the container with id fc26f612b471ffd2d7a301679979ccb83ce66d36891bca87dd5e9e0e8a8546db Oct 05 20:31:38 crc kubenswrapper[4753]: I1005 20:31:38.033505 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-866d77d954-p4qfh"] Oct 05 20:31:38 crc kubenswrapper[4753]: W1005 20:31:38.045206 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc507e884_7868_46c1_b89a_e8ee71f3e8e1.slice/crio-6946ac6b844ab9a07adbc4751d09b754f4040dc519fe326119d59397fcd17007 WatchSource:0}: Error finding container 6946ac6b844ab9a07adbc4751d09b754f4040dc519fe326119d59397fcd17007: Status 404 returned error can't find the container with id 
6946ac6b844ab9a07adbc4751d09b754f4040dc519fe326119d59397fcd17007 Oct 05 20:31:38 crc kubenswrapper[4753]: I1005 20:31:38.337293 4753 generic.go:334] "Generic (PLEG): container finished" podID="1cc6a54b-8439-4745-8b04-98f4d28d06b4" containerID="1d3c1c6b7888f801f218df3c50f8626e6c41bea6ec02aa1c20717c14baf0f0b8" exitCode=0 Oct 05 20:31:38 crc kubenswrapper[4753]: I1005 20:31:38.337392 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f69d7557-flmx9" event={"ID":"1cc6a54b-8439-4745-8b04-98f4d28d06b4","Type":"ContainerDied","Data":"1d3c1c6b7888f801f218df3c50f8626e6c41bea6ec02aa1c20717c14baf0f0b8"} Oct 05 20:31:38 crc kubenswrapper[4753]: I1005 20:31:38.337944 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f69d7557-flmx9" event={"ID":"1cc6a54b-8439-4745-8b04-98f4d28d06b4","Type":"ContainerStarted","Data":"fc26f612b471ffd2d7a301679979ccb83ce66d36891bca87dd5e9e0e8a8546db"} Oct 05 20:31:38 crc kubenswrapper[4753]: I1005 20:31:38.347957 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-866d77d954-p4qfh" event={"ID":"c507e884-7868-46c1-b89a-e8ee71f3e8e1","Type":"ContainerStarted","Data":"b69237393cc61b3255df8f36afafb6d480e49844867db99dd24af1a55c4a796b"} Oct 05 20:31:38 crc kubenswrapper[4753]: I1005 20:31:38.350105 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-866d77d954-p4qfh" event={"ID":"c507e884-7868-46c1-b89a-e8ee71f3e8e1","Type":"ContainerStarted","Data":"6946ac6b844ab9a07adbc4751d09b754f4040dc519fe326119d59397fcd17007"} Oct 05 20:31:39 crc kubenswrapper[4753]: I1005 20:31:39.362848 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f69d7557-flmx9" event={"ID":"1cc6a54b-8439-4745-8b04-98f4d28d06b4","Type":"ContainerStarted","Data":"707aaa5017e0cbe44a45fd815e9789c08c47f75e9bba9299091ae1d5d88916bd"} Oct 05 20:31:39 crc kubenswrapper[4753]: I1005 20:31:39.364153 4753 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:39 crc kubenswrapper[4753]: I1005 20:31:39.374889 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-866d77d954-p4qfh" event={"ID":"c507e884-7868-46c1-b89a-e8ee71f3e8e1","Type":"ContainerStarted","Data":"90f7763ad014d54f29c0606d7396bbc5c87ea44fb26910e2a8c081b7db9f1104"} Oct 05 20:31:39 crc kubenswrapper[4753]: I1005 20:31:39.384164 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f69d7557-flmx9" podStartSLOduration=3.384146824 podStartE2EDuration="3.384146824s" podCreationTimestamp="2025-10-05 20:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:31:39.380251493 +0000 UTC m=+1008.228579725" watchObservedRunningTime="2025-10-05 20:31:39.384146824 +0000 UTC m=+1008.232475056" Oct 05 20:31:39 crc kubenswrapper[4753]: I1005 20:31:39.399157 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-866d77d954-p4qfh" podStartSLOduration=2.399128583 podStartE2EDuration="2.399128583s" podCreationTimestamp="2025-10-05 20:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:31:39.39458074 +0000 UTC m=+1008.242908972" watchObservedRunningTime="2025-10-05 20:31:39.399128583 +0000 UTC m=+1008.247456815" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.091865 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-79cfb6d465-74j5v"] Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.093205 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.101387 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.101955 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.117126 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79cfb6d465-74j5v"] Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.251885 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm8rb\" (UniqueName: \"kubernetes.io/projected/e26c6617-558f-445a-be5b-02578e006437-kube-api-access-lm8rb\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.251971 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-httpd-config\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.252002 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-internal-tls-certs\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.252035 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-public-tls-certs\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.252059 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-config\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.252101 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-combined-ca-bundle\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.252123 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-ovndb-tls-certs\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.353377 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-combined-ca-bundle\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.353423 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-ovndb-tls-certs\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.353464 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm8rb\" (UniqueName: \"kubernetes.io/projected/e26c6617-558f-445a-be5b-02578e006437-kube-api-access-lm8rb\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.353535 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-httpd-config\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.353562 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-internal-tls-certs\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.353583 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-public-tls-certs\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.353603 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-config\") pod 
\"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.362427 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-config\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.363534 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-ovndb-tls-certs\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.364518 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-httpd-config\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.365605 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-combined-ca-bundle\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.368618 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-internal-tls-certs\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc 
kubenswrapper[4753]: I1005 20:31:40.373946 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e26c6617-558f-445a-be5b-02578e006437-public-tls-certs\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.377862 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm8rb\" (UniqueName: \"kubernetes.io/projected/e26c6617-558f-445a-be5b-02578e006437-kube-api-access-lm8rb\") pod \"neutron-79cfb6d465-74j5v\" (UID: \"e26c6617-558f-445a-be5b-02578e006437\") " pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.395482 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:31:40 crc kubenswrapper[4753]: I1005 20:31:40.415379 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:45 crc kubenswrapper[4753]: I1005 20:31:45.838485 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79cfb6d465-74j5v"] Oct 05 20:31:46 crc kubenswrapper[4753]: I1005 20:31:46.454876 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de197a05-86cb-4de6-b4f7-03e27dba1e02","Type":"ContainerStarted","Data":"4f6e58bba4c7b4929e262e1c0c467420e4b3de924a22186e597e841b9c9677b0"} Oct 05 20:31:46 crc kubenswrapper[4753]: I1005 20:31:46.455264 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="ceilometer-central-agent" containerID="cri-o://44a4f4ab27768bfc64dea4cc8500bdf65997de078dc781cdb323b3b8ce0b78f0" gracePeriod=30 Oct 05 20:31:46 crc kubenswrapper[4753]: I1005 20:31:46.455404 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 05 20:31:46 crc kubenswrapper[4753]: I1005 20:31:46.455489 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="proxy-httpd" containerID="cri-o://4f6e58bba4c7b4929e262e1c0c467420e4b3de924a22186e597e841b9c9677b0" gracePeriod=30 Oct 05 20:31:46 crc kubenswrapper[4753]: I1005 20:31:46.455535 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="sg-core" containerID="cri-o://52c1174d5cf59852f61e66a223d8e2b42f94b261f1b765a89b38dea23d972ee2" gracePeriod=30 Oct 05 20:31:46 crc kubenswrapper[4753]: I1005 20:31:46.455571 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="ceilometer-notification-agent" 
containerID="cri-o://db0ba4526062640baf7519ca063ea638d3449834d30bf58803214c5acf5dc485" gracePeriod=30 Oct 05 20:31:46 crc kubenswrapper[4753]: I1005 20:31:46.477576 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79cfb6d465-74j5v" event={"ID":"e26c6617-558f-445a-be5b-02578e006437","Type":"ContainerStarted","Data":"96f1e703ef19acbd49a0e5381334120a835fd0567af051b743ab3d633cbfeba3"} Oct 05 20:31:46 crc kubenswrapper[4753]: I1005 20:31:46.477654 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79cfb6d465-74j5v" event={"ID":"e26c6617-558f-445a-be5b-02578e006437","Type":"ContainerStarted","Data":"d25aed355be1c1985ca7ba08b7ca1a1babbc2469f955d556f8035d31da3acb84"} Oct 05 20:31:46 crc kubenswrapper[4753]: I1005 20:31:46.477668 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79cfb6d465-74j5v" event={"ID":"e26c6617-558f-445a-be5b-02578e006437","Type":"ContainerStarted","Data":"bf8194a6bf4a6b99efa23bc51bddc6563201b4096b60b4dccda5676daa6186e6"} Oct 05 20:31:46 crc kubenswrapper[4753]: I1005 20:31:46.478279 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-79cfb6d465-74j5v" Oct 05 20:31:46 crc kubenswrapper[4753]: I1005 20:31:46.507632 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.130089823 podStartE2EDuration="46.507614903s" podCreationTimestamp="2025-10-05 20:31:00 +0000 UTC" firstStartedPulling="2025-10-05 20:31:02.045278053 +0000 UTC m=+970.893606285" lastFinishedPulling="2025-10-05 20:31:45.422803133 +0000 UTC m=+1014.271131365" observedRunningTime="2025-10-05 20:31:46.497061763 +0000 UTC m=+1015.345389995" watchObservedRunningTime="2025-10-05 20:31:46.507614903 +0000 UTC m=+1015.355943135" Oct 05 20:31:46 crc kubenswrapper[4753]: I1005 20:31:46.523067 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-79cfb6d465-74j5v" 
podStartSLOduration=6.523044225 podStartE2EDuration="6.523044225s" podCreationTimestamp="2025-10-05 20:31:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:31:46.521040532 +0000 UTC m=+1015.369368764" watchObservedRunningTime="2025-10-05 20:31:46.523044225 +0000 UTC m=+1015.371372457" Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.300185 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f69d7557-flmx9" Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.351168 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d8869ff97-2bt5k"] Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.351391 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" podUID="ff380460-00c8-4b7e-b3e0-1d7fb18ed268" containerName="dnsmasq-dns" containerID="cri-o://b70d465c01a87ed4baec0992a924024a9f770e05dbbab822b47d2b9a06e39537" gracePeriod=10 Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.411845 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" podUID="ff380460-00c8-4b7e-b3e0-1d7fb18ed268" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.513516 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tnf7s" event={"ID":"e3841483-2af9-40d8-8197-531d8dd1e57f","Type":"ContainerStarted","Data":"9b48a93f66579c4f92855e2f8cdd79a3a8cf212397ad1595b8b848f6832d9b3c"} Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.520011 4753 generic.go:334] "Generic (PLEG): container finished" podID="ff380460-00c8-4b7e-b3e0-1d7fb18ed268" containerID="b70d465c01a87ed4baec0992a924024a9f770e05dbbab822b47d2b9a06e39537" exitCode=0 Oct 05 20:31:47 crc 
kubenswrapper[4753]: I1005 20:31:47.520058 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" event={"ID":"ff380460-00c8-4b7e-b3e0-1d7fb18ed268","Type":"ContainerDied","Data":"b70d465c01a87ed4baec0992a924024a9f770e05dbbab822b47d2b9a06e39537"} Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.523893 4753 generic.go:334] "Generic (PLEG): container finished" podID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerID="4f6e58bba4c7b4929e262e1c0c467420e4b3de924a22186e597e841b9c9677b0" exitCode=0 Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.523932 4753 generic.go:334] "Generic (PLEG): container finished" podID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerID="52c1174d5cf59852f61e66a223d8e2b42f94b261f1b765a89b38dea23d972ee2" exitCode=2 Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.523940 4753 generic.go:334] "Generic (PLEG): container finished" podID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerID="db0ba4526062640baf7519ca063ea638d3449834d30bf58803214c5acf5dc485" exitCode=0 Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.523938 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de197a05-86cb-4de6-b4f7-03e27dba1e02","Type":"ContainerDied","Data":"4f6e58bba4c7b4929e262e1c0c467420e4b3de924a22186e597e841b9c9677b0"} Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.523995 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de197a05-86cb-4de6-b4f7-03e27dba1e02","Type":"ContainerDied","Data":"52c1174d5cf59852f61e66a223d8e2b42f94b261f1b765a89b38dea23d972ee2"} Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.524015 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de197a05-86cb-4de6-b4f7-03e27dba1e02","Type":"ContainerDied","Data":"db0ba4526062640baf7519ca063ea638d3449834d30bf58803214c5acf5dc485"} Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 
20:31:47.524026 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de197a05-86cb-4de6-b4f7-03e27dba1e02","Type":"ContainerDied","Data":"44a4f4ab27768bfc64dea4cc8500bdf65997de078dc781cdb323b3b8ce0b78f0"} Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.523950 4753 generic.go:334] "Generic (PLEG): container finished" podID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerID="44a4f4ab27768bfc64dea4cc8500bdf65997de078dc781cdb323b3b8ce0b78f0" exitCode=0 Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.528439 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-tnf7s" podStartSLOduration=4.954412615 podStartE2EDuration="38.528425364s" podCreationTimestamp="2025-10-05 20:31:09 +0000 UTC" firstStartedPulling="2025-10-05 20:31:12.7376487 +0000 UTC m=+981.585976932" lastFinishedPulling="2025-10-05 20:31:46.311661419 +0000 UTC m=+1015.159989681" observedRunningTime="2025-10-05 20:31:47.52765489 +0000 UTC m=+1016.375983122" watchObservedRunningTime="2025-10-05 20:31:47.528425364 +0000 UTC m=+1016.376753596" Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.959273 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:47 crc kubenswrapper[4753]: I1005 20:31:47.966533 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.089714 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-dns-svc\") pod \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.089761 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-scripts\") pod \"de197a05-86cb-4de6-b4f7-03e27dba1e02\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.089789 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de197a05-86cb-4de6-b4f7-03e27dba1e02-run-httpd\") pod \"de197a05-86cb-4de6-b4f7-03e27dba1e02\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.089817 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-config\") pod \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.089840 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-config-data\") pod \"de197a05-86cb-4de6-b4f7-03e27dba1e02\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.089865 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-ovsdbserver-nb\") pod \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.089923 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2797l\" (UniqueName: \"kubernetes.io/projected/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-kube-api-access-2797l\") pod \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.089969 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de197a05-86cb-4de6-b4f7-03e27dba1e02-log-httpd\") pod \"de197a05-86cb-4de6-b4f7-03e27dba1e02\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.089989 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-combined-ca-bundle\") pod \"de197a05-86cb-4de6-b4f7-03e27dba1e02\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.090029 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-ovsdbserver-sb\") pod \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\" (UID: \"ff380460-00c8-4b7e-b3e0-1d7fb18ed268\") " Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.090081 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwqfg\" (UniqueName: \"kubernetes.io/projected/de197a05-86cb-4de6-b4f7-03e27dba1e02-kube-api-access-nwqfg\") pod \"de197a05-86cb-4de6-b4f7-03e27dba1e02\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " Oct 05 20:31:48 crc 
kubenswrapper[4753]: I1005 20:31:48.090115 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-sg-core-conf-yaml\") pod \"de197a05-86cb-4de6-b4f7-03e27dba1e02\" (UID: \"de197a05-86cb-4de6-b4f7-03e27dba1e02\") " Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.092261 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de197a05-86cb-4de6-b4f7-03e27dba1e02-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "de197a05-86cb-4de6-b4f7-03e27dba1e02" (UID: "de197a05-86cb-4de6-b4f7-03e27dba1e02"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.092651 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de197a05-86cb-4de6-b4f7-03e27dba1e02-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "de197a05-86cb-4de6-b4f7-03e27dba1e02" (UID: "de197a05-86cb-4de6-b4f7-03e27dba1e02"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.097055 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-scripts" (OuterVolumeSpecName: "scripts") pod "de197a05-86cb-4de6-b4f7-03e27dba1e02" (UID: "de197a05-86cb-4de6-b4f7-03e27dba1e02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.097885 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-kube-api-access-2797l" (OuterVolumeSpecName: "kube-api-access-2797l") pod "ff380460-00c8-4b7e-b3e0-1d7fb18ed268" (UID: "ff380460-00c8-4b7e-b3e0-1d7fb18ed268"). 
InnerVolumeSpecName "kube-api-access-2797l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.099367 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de197a05-86cb-4de6-b4f7-03e27dba1e02-kube-api-access-nwqfg" (OuterVolumeSpecName: "kube-api-access-nwqfg") pod "de197a05-86cb-4de6-b4f7-03e27dba1e02" (UID: "de197a05-86cb-4de6-b4f7-03e27dba1e02"). InnerVolumeSpecName "kube-api-access-nwqfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.138107 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "de197a05-86cb-4de6-b4f7-03e27dba1e02" (UID: "de197a05-86cb-4de6-b4f7-03e27dba1e02"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.148408 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff380460-00c8-4b7e-b3e0-1d7fb18ed268" (UID: "ff380460-00c8-4b7e-b3e0-1d7fb18ed268"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.151567 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff380460-00c8-4b7e-b3e0-1d7fb18ed268" (UID: "ff380460-00c8-4b7e-b3e0-1d7fb18ed268"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.154789 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-config" (OuterVolumeSpecName: "config") pod "ff380460-00c8-4b7e-b3e0-1d7fb18ed268" (UID: "ff380460-00c8-4b7e-b3e0-1d7fb18ed268"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.161953 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff380460-00c8-4b7e-b3e0-1d7fb18ed268" (UID: "ff380460-00c8-4b7e-b3e0-1d7fb18ed268"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.194529 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2797l\" (UniqueName: \"kubernetes.io/projected/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-kube-api-access-2797l\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.194557 4753 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de197a05-86cb-4de6-b4f7-03e27dba1e02-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.194566 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.194576 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwqfg\" (UniqueName: \"kubernetes.io/projected/de197a05-86cb-4de6-b4f7-03e27dba1e02-kube-api-access-nwqfg\") on node \"crc\" DevicePath 
\"\"" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.194584 4753 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.194593 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.194601 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.194609 4753 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de197a05-86cb-4de6-b4f7-03e27dba1e02-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.194619 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.194626 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff380460-00c8-4b7e-b3e0-1d7fb18ed268-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.201255 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-config-data" (OuterVolumeSpecName: "config-data") pod "de197a05-86cb-4de6-b4f7-03e27dba1e02" (UID: "de197a05-86cb-4de6-b4f7-03e27dba1e02"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.210300 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de197a05-86cb-4de6-b4f7-03e27dba1e02" (UID: "de197a05-86cb-4de6-b4f7-03e27dba1e02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.295930 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.295965 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de197a05-86cb-4de6-b4f7-03e27dba1e02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.533127 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" event={"ID":"ff380460-00c8-4b7e-b3e0-1d7fb18ed268","Type":"ContainerDied","Data":"76f0a33fd0049981fb130bcb059f7da2a4eece5fdecaa34436b86c6d2f5ec5ac"} Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.534040 4753 scope.go:117] "RemoveContainer" containerID="b70d465c01a87ed4baec0992a924024a9f770e05dbbab822b47d2b9a06e39537" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.533246 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d8869ff97-2bt5k" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.540498 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de197a05-86cb-4de6-b4f7-03e27dba1e02","Type":"ContainerDied","Data":"988ebece8656993c27c5a1c60af8d23482c97ee990be05e17dfd5789790a54e0"} Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.540612 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.584696 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d8869ff97-2bt5k"] Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.601876 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d8869ff97-2bt5k"] Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.609825 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.622285 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.635234 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:31:48 crc kubenswrapper[4753]: E1005 20:31:48.635642 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff380460-00c8-4b7e-b3e0-1d7fb18ed268" containerName="dnsmasq-dns" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.635659 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff380460-00c8-4b7e-b3e0-1d7fb18ed268" containerName="dnsmasq-dns" Oct 05 20:31:48 crc kubenswrapper[4753]: E1005 20:31:48.635686 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="ceilometer-central-agent" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.635700 4753 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="ceilometer-central-agent" Oct 05 20:31:48 crc kubenswrapper[4753]: E1005 20:31:48.635719 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="proxy-httpd" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.635729 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="proxy-httpd" Oct 05 20:31:48 crc kubenswrapper[4753]: E1005 20:31:48.635751 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="sg-core" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.635759 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="sg-core" Oct 05 20:31:48 crc kubenswrapper[4753]: E1005 20:31:48.635776 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="ceilometer-notification-agent" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.635785 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="ceilometer-notification-agent" Oct 05 20:31:48 crc kubenswrapper[4753]: E1005 20:31:48.635805 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff380460-00c8-4b7e-b3e0-1d7fb18ed268" containerName="init" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.635813 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff380460-00c8-4b7e-b3e0-1d7fb18ed268" containerName="init" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.636015 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff380460-00c8-4b7e-b3e0-1d7fb18ed268" containerName="dnsmasq-dns" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.636027 4753 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="proxy-httpd" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.636060 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="ceilometer-central-agent" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.636076 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="sg-core" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.636104 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" containerName="ceilometer-notification-agent" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.637979 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.643255 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.643588 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.651253 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.804193 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.804332 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph7hf\" (UniqueName: 
\"kubernetes.io/projected/17c5fb23-4894-4cf6-a12c-c698a36c6450-kube-api-access-ph7hf\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.804356 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.804394 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-scripts\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.804443 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17c5fb23-4894-4cf6-a12c-c698a36c6450-log-httpd\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.804506 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-config-data\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.804560 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17c5fb23-4894-4cf6-a12c-c698a36c6450-run-httpd\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " 
pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.828314 4753 scope.go:117] "RemoveContainer" containerID="1e5582aceea0130d0cd2cad433ca22e90d4de87aa5b710c6753dd08d8bdee954" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.906337 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph7hf\" (UniqueName: \"kubernetes.io/projected/17c5fb23-4894-4cf6-a12c-c698a36c6450-kube-api-access-ph7hf\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.906375 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.906401 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-scripts\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.906420 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17c5fb23-4894-4cf6-a12c-c698a36c6450-log-httpd\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.906439 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-config-data\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc 
kubenswrapper[4753]: I1005 20:31:48.906465 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17c5fb23-4894-4cf6-a12c-c698a36c6450-run-httpd\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.906532 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.907523 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17c5fb23-4894-4cf6-a12c-c698a36c6450-run-httpd\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.907519 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17c5fb23-4894-4cf6-a12c-c698a36c6450-log-httpd\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.908499 4753 scope.go:117] "RemoveContainer" containerID="4f6e58bba4c7b4929e262e1c0c467420e4b3de924a22186e597e841b9c9677b0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.910016 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.910353 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.912483 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-scripts\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.922014 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-config-data\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.924749 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph7hf\" (UniqueName: \"kubernetes.io/projected/17c5fb23-4894-4cf6-a12c-c698a36c6450-kube-api-access-ph7hf\") pod \"ceilometer-0\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " pod="openstack/ceilometer-0" Oct 05 20:31:48 crc kubenswrapper[4753]: I1005 20:31:48.960689 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:31:49 crc kubenswrapper[4753]: I1005 20:31:49.025242 4753 scope.go:117] "RemoveContainer" containerID="52c1174d5cf59852f61e66a223d8e2b42f94b261f1b765a89b38dea23d972ee2" Oct 05 20:31:49 crc kubenswrapper[4753]: I1005 20:31:49.055563 4753 scope.go:117] "RemoveContainer" containerID="db0ba4526062640baf7519ca063ea638d3449834d30bf58803214c5acf5dc485" Oct 05 20:31:49 crc kubenswrapper[4753]: I1005 20:31:49.080622 4753 scope.go:117] "RemoveContainer" containerID="44a4f4ab27768bfc64dea4cc8500bdf65997de078dc781cdb323b3b8ce0b78f0" Oct 05 20:31:49 crc kubenswrapper[4753]: I1005 20:31:49.461815 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:31:49 crc kubenswrapper[4753]: I1005 20:31:49.550864 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17c5fb23-4894-4cf6-a12c-c698a36c6450","Type":"ContainerStarted","Data":"bd2c030521eb6a8e98d1ecd14f0bd132c109c017e6c73122e853150120213688"} Oct 05 20:31:49 crc kubenswrapper[4753]: I1005 20:31:49.862656 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de197a05-86cb-4de6-b4f7-03e27dba1e02" path="/var/lib/kubelet/pods/de197a05-86cb-4de6-b4f7-03e27dba1e02/volumes" Oct 05 20:31:49 crc kubenswrapper[4753]: I1005 20:31:49.863424 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff380460-00c8-4b7e-b3e0-1d7fb18ed268" path="/var/lib/kubelet/pods/ff380460-00c8-4b7e-b3e0-1d7fb18ed268/volumes" Oct 05 20:31:50 crc kubenswrapper[4753]: I1005 20:31:50.322515 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:50 crc kubenswrapper[4753]: I1005 20:31:50.326255 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78454fb4-ktvqp" Oct 05 20:31:50 crc kubenswrapper[4753]: I1005 20:31:50.564385 4753 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"17c5fb23-4894-4cf6-a12c-c698a36c6450","Type":"ContainerStarted","Data":"bc56ade412c71d62989a39d6e00e1ad36d72e3985c60278b3113e7adcd1cee5b"} Oct 05 20:31:51 crc kubenswrapper[4753]: I1005 20:31:51.577817 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b97mg" event={"ID":"fa63f817-249e-416d-aa21-47fe6e04180c","Type":"ContainerStarted","Data":"059fa7c26c414539ba189c6a39977d8007010aaa19369d67671d694f889828b7"} Oct 05 20:31:51 crc kubenswrapper[4753]: I1005 20:31:51.584609 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17c5fb23-4894-4cf6-a12c-c698a36c6450","Type":"ContainerStarted","Data":"459a708a60fda760e18b6910009580528f842822d6bbb458960969d25dd88f45"} Oct 05 20:31:51 crc kubenswrapper[4753]: I1005 20:31:51.584869 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17c5fb23-4894-4cf6-a12c-c698a36c6450","Type":"ContainerStarted","Data":"81cae57410065c718a3e174a14d772660043a6a78543a94b4ca1e47bd0938c4d"} Oct 05 20:31:51 crc kubenswrapper[4753]: I1005 20:31:51.600618 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-b97mg" podStartSLOduration=1.84739817 podStartE2EDuration="37.600594079s" podCreationTimestamp="2025-10-05 20:31:14 +0000 UTC" firstStartedPulling="2025-10-05 20:31:15.561016381 +0000 UTC m=+984.409344613" lastFinishedPulling="2025-10-05 20:31:51.31421229 +0000 UTC m=+1020.162540522" observedRunningTime="2025-10-05 20:31:51.596043506 +0000 UTC m=+1020.444371738" watchObservedRunningTime="2025-10-05 20:31:51.600594079 +0000 UTC m=+1020.448922331" Oct 05 20:31:53 crc kubenswrapper[4753]: I1005 20:31:53.613113 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"17c5fb23-4894-4cf6-a12c-c698a36c6450","Type":"ContainerStarted","Data":"00c099a79a982ca0152e2f01be1f6f641f4d833e6463861f5fd79273026ef46f"} Oct 05 20:31:53 crc kubenswrapper[4753]: I1005 20:31:53.614704 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 05 20:31:53 crc kubenswrapper[4753]: I1005 20:31:53.616179 4753 generic.go:334] "Generic (PLEG): container finished" podID="e3841483-2af9-40d8-8197-531d8dd1e57f" containerID="9b48a93f66579c4f92855e2f8cdd79a3a8cf212397ad1595b8b848f6832d9b3c" exitCode=0 Oct 05 20:31:53 crc kubenswrapper[4753]: I1005 20:31:53.616205 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tnf7s" event={"ID":"e3841483-2af9-40d8-8197-531d8dd1e57f","Type":"ContainerDied","Data":"9b48a93f66579c4f92855e2f8cdd79a3a8cf212397ad1595b8b848f6832d9b3c"} Oct 05 20:31:53 crc kubenswrapper[4753]: I1005 20:31:53.644849 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.31774425 podStartE2EDuration="5.644831781s" podCreationTimestamp="2025-10-05 20:31:48 +0000 UTC" firstStartedPulling="2025-10-05 20:31:49.472250768 +0000 UTC m=+1018.320579000" lastFinishedPulling="2025-10-05 20:31:52.799338299 +0000 UTC m=+1021.647666531" observedRunningTime="2025-10-05 20:31:53.634525859 +0000 UTC m=+1022.482854111" watchObservedRunningTime="2025-10-05 20:31:53.644831781 +0000 UTC m=+1022.493160013" Oct 05 20:31:54 crc kubenswrapper[4753]: I1005 20:31:54.627452 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa63f817-249e-416d-aa21-47fe6e04180c" containerID="059fa7c26c414539ba189c6a39977d8007010aaa19369d67671d694f889828b7" exitCode=0 Oct 05 20:31:54 crc kubenswrapper[4753]: I1005 20:31:54.627565 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b97mg" 
event={"ID":"fa63f817-249e-416d-aa21-47fe6e04180c","Type":"ContainerDied","Data":"059fa7c26c414539ba189c6a39977d8007010aaa19369d67671d694f889828b7"} Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.044802 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.119783 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-combined-ca-bundle\") pod \"e3841483-2af9-40d8-8197-531d8dd1e57f\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.119854 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mj2v\" (UniqueName: \"kubernetes.io/projected/e3841483-2af9-40d8-8197-531d8dd1e57f-kube-api-access-8mj2v\") pod \"e3841483-2af9-40d8-8197-531d8dd1e57f\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.119877 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-db-sync-config-data\") pod \"e3841483-2af9-40d8-8197-531d8dd1e57f\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.119909 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-scripts\") pod \"e3841483-2af9-40d8-8197-531d8dd1e57f\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.119935 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e3841483-2af9-40d8-8197-531d8dd1e57f-etc-machine-id\") pod \"e3841483-2af9-40d8-8197-531d8dd1e57f\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.120119 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-config-data\") pod \"e3841483-2af9-40d8-8197-531d8dd1e57f\" (UID: \"e3841483-2af9-40d8-8197-531d8dd1e57f\") " Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.120206 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3841483-2af9-40d8-8197-531d8dd1e57f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e3841483-2af9-40d8-8197-531d8dd1e57f" (UID: "e3841483-2af9-40d8-8197-531d8dd1e57f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.120953 4753 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3841483-2af9-40d8-8197-531d8dd1e57f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.125040 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e3841483-2af9-40d8-8197-531d8dd1e57f" (UID: "e3841483-2af9-40d8-8197-531d8dd1e57f"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.125608 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-scripts" (OuterVolumeSpecName: "scripts") pod "e3841483-2af9-40d8-8197-531d8dd1e57f" (UID: "e3841483-2af9-40d8-8197-531d8dd1e57f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.126309 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3841483-2af9-40d8-8197-531d8dd1e57f-kube-api-access-8mj2v" (OuterVolumeSpecName: "kube-api-access-8mj2v") pod "e3841483-2af9-40d8-8197-531d8dd1e57f" (UID: "e3841483-2af9-40d8-8197-531d8dd1e57f"). InnerVolumeSpecName "kube-api-access-8mj2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.145021 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3841483-2af9-40d8-8197-531d8dd1e57f" (UID: "e3841483-2af9-40d8-8197-531d8dd1e57f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.174817 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-config-data" (OuterVolumeSpecName: "config-data") pod "e3841483-2af9-40d8-8197-531d8dd1e57f" (UID: "e3841483-2af9-40d8-8197-531d8dd1e57f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.222850 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.222886 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.222901 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mj2v\" (UniqueName: \"kubernetes.io/projected/e3841483-2af9-40d8-8197-531d8dd1e57f-kube-api-access-8mj2v\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.222915 4753 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.222926 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3841483-2af9-40d8-8197-531d8dd1e57f-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.640410 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tnf7s" event={"ID":"e3841483-2af9-40d8-8197-531d8dd1e57f","Type":"ContainerDied","Data":"0f14a13b63704478d02f231e152e69837d90cafb0b089ae01ef904d6a520f7af"} Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.640483 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f14a13b63704478d02f231e152e69837d90cafb0b089ae01ef904d6a520f7af" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.640497 4753 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tnf7s" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.735617 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-85d68b7848-bh92h" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.909310 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 05 20:31:55 crc kubenswrapper[4753]: E1005 20:31:55.909662 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3841483-2af9-40d8-8197-531d8dd1e57f" containerName="cinder-db-sync" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.909680 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3841483-2af9-40d8-8197-531d8dd1e57f" containerName="cinder-db-sync" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.909866 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3841483-2af9-40d8-8197-531d8dd1e57f" containerName="cinder-db-sync" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.910686 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.913740 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-92d8j" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.919187 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.920666 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.922466 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 05 20:31:55 crc kubenswrapper[4753]: I1005 20:31:55.942530 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.024489 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cfb658c9c-tbrg7"] Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.035510 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.044304 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfb658c9c-tbrg7"] Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.045896 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-scripts\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.045975 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.046033 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr5g5\" (UniqueName: \"kubernetes.io/projected/c6a9b992-43bc-4727-a6a6-f4280c356ebb-kube-api-access-wr5g5\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.046066 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-config-data\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.046098 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/c6a9b992-43bc-4727-a6a6-f4280c356ebb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.046176 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.158263 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr5g5\" (UniqueName: \"kubernetes.io/projected/c6a9b992-43bc-4727-a6a6-f4280c356ebb-kube-api-access-wr5g5\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.158989 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-config-data\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.159268 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c6a9b992-43bc-4727-a6a6-f4280c356ebb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.159369 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.159404 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-dns-svc\") pod \"dnsmasq-dns-6cfb658c9c-tbrg7\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.159433 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-config\") pod \"dnsmasq-dns-6cfb658c9c-tbrg7\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.159486 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-scripts\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.159548 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh2tm\" (UniqueName: \"kubernetes.io/projected/333d3641-348f-40c7-b935-2a37d4269cb5-kube-api-access-xh2tm\") pod \"dnsmasq-dns-6cfb658c9c-tbrg7\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.159622 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfb658c9c-tbrg7\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") 
" pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.159656 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.159690 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfb658c9c-tbrg7\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.160499 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c6a9b992-43bc-4727-a6a6-f4280c356ebb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.185261 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-scripts\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.187298 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.189879 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.212589 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr5g5\" (UniqueName: \"kubernetes.io/projected/c6a9b992-43bc-4727-a6a6-f4280c356ebb-kube-api-access-wr5g5\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.214873 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.236513 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-config-data\") pod \"cinder-scheduler-0\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.243956 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.244072 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.250080 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.258150 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.262309 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.263433 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh2tm\" (UniqueName: \"kubernetes.io/projected/333d3641-348f-40c7-b935-2a37d4269cb5-kube-api-access-xh2tm\") pod \"dnsmasq-dns-6cfb658c9c-tbrg7\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.263473 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfb658c9c-tbrg7\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.263498 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfb658c9c-tbrg7\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.263610 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-dns-svc\") pod \"dnsmasq-dns-6cfb658c9c-tbrg7\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.263632 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-config\") pod \"dnsmasq-dns-6cfb658c9c-tbrg7\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 
crc kubenswrapper[4753]: I1005 20:31:56.265133 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-sdkcg" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.266558 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfb658c9c-tbrg7\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.269412 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-config\") pod \"dnsmasq-dns-6cfb658c9c-tbrg7\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.269594 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfb658c9c-tbrg7\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.270020 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-dns-svc\") pod \"dnsmasq-dns-6cfb658c9c-tbrg7\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.271934 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.272083 4753 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cinder-api-config-data" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.308225 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.312021 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh2tm\" (UniqueName: \"kubernetes.io/projected/333d3641-348f-40c7-b935-2a37d4269cb5-kube-api-access-xh2tm\") pod \"dnsmasq-dns-6cfb658c9c-tbrg7\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.336179 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-b97mg" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.364807 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae2325dc-1d53-4605-84d9-c5a341d6c311-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ae2325dc-1d53-4605-84d9-c5a341d6c311\") " pod="openstack/openstackclient" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.364867 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-config-data-custom\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.364921 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txtw2\" (UniqueName: \"kubernetes.io/projected/ae2325dc-1d53-4605-84d9-c5a341d6c311-kube-api-access-txtw2\") pod \"openstackclient\" (UID: \"ae2325dc-1d53-4605-84d9-c5a341d6c311\") " pod="openstack/openstackclient" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.364956 
4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-config-data\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.365020 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.365093 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae2325dc-1d53-4605-84d9-c5a341d6c311-openstack-config\") pod \"openstackclient\" (UID: \"ae2325dc-1d53-4605-84d9-c5a341d6c311\") " pod="openstack/openstackclient" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.365120 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.365185 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae2325dc-1d53-4605-84d9-c5a341d6c311-openstack-config-secret\") pod \"openstackclient\" (UID: \"ae2325dc-1d53-4605-84d9-c5a341d6c311\") " pod="openstack/openstackclient" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.365216 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-lt92v\" (UniqueName: \"kubernetes.io/projected/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-kube-api-access-lt92v\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.365292 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-logs\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.365315 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-scripts\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.393383 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.466691 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fa63f817-249e-416d-aa21-47fe6e04180c-db-sync-config-data\") pod \"fa63f817-249e-416d-aa21-47fe6e04180c\" (UID: \"fa63f817-249e-416d-aa21-47fe6e04180c\") " Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.466763 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j74jc\" (UniqueName: \"kubernetes.io/projected/fa63f817-249e-416d-aa21-47fe6e04180c-kube-api-access-j74jc\") pod \"fa63f817-249e-416d-aa21-47fe6e04180c\" (UID: \"fa63f817-249e-416d-aa21-47fe6e04180c\") " Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.466814 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa63f817-249e-416d-aa21-47fe6e04180c-combined-ca-bundle\") pod \"fa63f817-249e-416d-aa21-47fe6e04180c\" (UID: \"fa63f817-249e-416d-aa21-47fe6e04180c\") " Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.467052 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txtw2\" (UniqueName: \"kubernetes.io/projected/ae2325dc-1d53-4605-84d9-c5a341d6c311-kube-api-access-txtw2\") pod \"openstackclient\" (UID: \"ae2325dc-1d53-4605-84d9-c5a341d6c311\") " pod="openstack/openstackclient" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.467077 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-config-data\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.467096 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.467158 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae2325dc-1d53-4605-84d9-c5a341d6c311-openstack-config\") pod \"openstackclient\" (UID: \"ae2325dc-1d53-4605-84d9-c5a341d6c311\") " pod="openstack/openstackclient" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.467178 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.467196 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae2325dc-1d53-4605-84d9-c5a341d6c311-openstack-config-secret\") pod \"openstackclient\" (UID: \"ae2325dc-1d53-4605-84d9-c5a341d6c311\") " pod="openstack/openstackclient" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.467214 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt92v\" (UniqueName: \"kubernetes.io/projected/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-kube-api-access-lt92v\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.467257 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-logs\") pod \"cinder-api-0\" 
(UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.467276 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-scripts\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.467302 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae2325dc-1d53-4605-84d9-c5a341d6c311-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ae2325dc-1d53-4605-84d9-c5a341d6c311\") " pod="openstack/openstackclient" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.467319 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-config-data-custom\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.467693 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.470590 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae2325dc-1d53-4605-84d9-c5a341d6c311-openstack-config\") pod \"openstackclient\" (UID: \"ae2325dc-1d53-4605-84d9-c5a341d6c311\") " pod="openstack/openstackclient" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.470794 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-logs\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.470975 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa63f817-249e-416d-aa21-47fe6e04180c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fa63f817-249e-416d-aa21-47fe6e04180c" (UID: "fa63f817-249e-416d-aa21-47fe6e04180c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.478127 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa63f817-249e-416d-aa21-47fe6e04180c-kube-api-access-j74jc" (OuterVolumeSpecName: "kube-api-access-j74jc") pod "fa63f817-249e-416d-aa21-47fe6e04180c" (UID: "fa63f817-249e-416d-aa21-47fe6e04180c"). InnerVolumeSpecName "kube-api-access-j74jc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.482056 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-config-data\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.482557 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-config-data-custom\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.483445 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-scripts\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.483951 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.487610 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae2325dc-1d53-4605-84d9-c5a341d6c311-openstack-config-secret\") pod \"openstackclient\" (UID: \"ae2325dc-1d53-4605-84d9-c5a341d6c311\") " pod="openstack/openstackclient" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.488584 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ae2325dc-1d53-4605-84d9-c5a341d6c311-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ae2325dc-1d53-4605-84d9-c5a341d6c311\") " pod="openstack/openstackclient" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.495781 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt92v\" (UniqueName: \"kubernetes.io/projected/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-kube-api-access-lt92v\") pod \"cinder-api-0\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.512957 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txtw2\" (UniqueName: \"kubernetes.io/projected/ae2325dc-1d53-4605-84d9-c5a341d6c311-kube-api-access-txtw2\") pod \"openstackclient\" (UID: \"ae2325dc-1d53-4605-84d9-c5a341d6c311\") " pod="openstack/openstackclient" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.532362 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa63f817-249e-416d-aa21-47fe6e04180c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa63f817-249e-416d-aa21-47fe6e04180c" (UID: "fa63f817-249e-416d-aa21-47fe6e04180c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.535470 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.568338 4753 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fa63f817-249e-416d-aa21-47fe6e04180c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.568601 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j74jc\" (UniqueName: \"kubernetes.io/projected/fa63f817-249e-416d-aa21-47fe6e04180c-kube-api-access-j74jc\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.568614 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa63f817-249e-416d-aa21-47fe6e04180c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.668099 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.695269 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.720453 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b97mg" event={"ID":"fa63f817-249e-416d-aa21-47fe6e04180c","Type":"ContainerDied","Data":"e0f161dcbf71def7995528f83abecfc245f326f7292afda86e9647772f4ffeab"} Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.720492 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0f161dcbf71def7995528f83abecfc245f326f7292afda86e9647772f4ffeab" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.720572 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-b97mg" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.964520 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-c895b6649-pwwkn"] Oct 05 20:31:56 crc kubenswrapper[4753]: E1005 20:31:56.964878 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa63f817-249e-416d-aa21-47fe6e04180c" containerName="barbican-db-sync" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.964890 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa63f817-249e-416d-aa21-47fe6e04180c" containerName="barbican-db-sync" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.965082 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa63f817-249e-416d-aa21-47fe6e04180c" containerName="barbican-db-sync" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.965921 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.971475 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.973503 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.973545 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f896l" Oct 05 20:31:56 crc kubenswrapper[4753]: I1005 20:31:56.994151 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-c895b6649-pwwkn"] Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.016284 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-fc7ffbf98-q7225"] Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.033253 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.035325 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.048193 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-fc7ffbf98-q7225"] Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.079248 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfb658c9c-tbrg7"] Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.084082 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkfff\" (UniqueName: \"kubernetes.io/projected/1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c-kube-api-access-nkfff\") pod \"barbican-keystone-listener-fc7ffbf98-q7225\" (UID: \"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c\") " pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.084149 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f08b26d-6499-4aac-93c6-7d07e4e98d47-logs\") pod \"barbican-worker-c895b6649-pwwkn\" (UID: \"4f08b26d-6499-4aac-93c6-7d07e4e98d47\") " pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.084185 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c-config-data-custom\") pod \"barbican-keystone-listener-fc7ffbf98-q7225\" (UID: \"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c\") " pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.084210 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpb2d\" (UniqueName: \"kubernetes.io/projected/4f08b26d-6499-4aac-93c6-7d07e4e98d47-kube-api-access-jpb2d\") pod \"barbican-worker-c895b6649-pwwkn\" (UID: \"4f08b26d-6499-4aac-93c6-7d07e4e98d47\") " pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.084253 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f08b26d-6499-4aac-93c6-7d07e4e98d47-config-data\") pod \"barbican-worker-c895b6649-pwwkn\" (UID: \"4f08b26d-6499-4aac-93c6-7d07e4e98d47\") " pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.084273 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c-logs\") pod \"barbican-keystone-listener-fc7ffbf98-q7225\" (UID: \"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c\") " pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.084308 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c-combined-ca-bundle\") pod \"barbican-keystone-listener-fc7ffbf98-q7225\" (UID: \"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c\") " pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.084325 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f08b26d-6499-4aac-93c6-7d07e4e98d47-combined-ca-bundle\") pod \"barbican-worker-c895b6649-pwwkn\" (UID: \"4f08b26d-6499-4aac-93c6-7d07e4e98d47\") " 
pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.084339 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c-config-data\") pod \"barbican-keystone-listener-fc7ffbf98-q7225\" (UID: \"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c\") " pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.084384 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f08b26d-6499-4aac-93c6-7d07e4e98d47-config-data-custom\") pod \"barbican-worker-c895b6649-pwwkn\" (UID: \"4f08b26d-6499-4aac-93c6-7d07e4e98d47\") " pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.100875 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfb658c9c-tbrg7"] Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.148332 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bb644bd75-f6958"] Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.150186 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.169348 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bb644bd75-f6958"] Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.187484 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkfff\" (UniqueName: \"kubernetes.io/projected/1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c-kube-api-access-nkfff\") pod \"barbican-keystone-listener-fc7ffbf98-q7225\" (UID: \"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c\") " pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.187541 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f08b26d-6499-4aac-93c6-7d07e4e98d47-logs\") pod \"barbican-worker-c895b6649-pwwkn\" (UID: \"4f08b26d-6499-4aac-93c6-7d07e4e98d47\") " pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.187577 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c-config-data-custom\") pod \"barbican-keystone-listener-fc7ffbf98-q7225\" (UID: \"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c\") " pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.187606 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpb2d\" (UniqueName: \"kubernetes.io/projected/4f08b26d-6499-4aac-93c6-7d07e4e98d47-kube-api-access-jpb2d\") pod \"barbican-worker-c895b6649-pwwkn\" (UID: \"4f08b26d-6499-4aac-93c6-7d07e4e98d47\") " pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.187651 4753 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f08b26d-6499-4aac-93c6-7d07e4e98d47-config-data\") pod \"barbican-worker-c895b6649-pwwkn\" (UID: \"4f08b26d-6499-4aac-93c6-7d07e4e98d47\") " pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.187670 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c-logs\") pod \"barbican-keystone-listener-fc7ffbf98-q7225\" (UID: \"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c\") " pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.187703 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f08b26d-6499-4aac-93c6-7d07e4e98d47-combined-ca-bundle\") pod \"barbican-worker-c895b6649-pwwkn\" (UID: \"4f08b26d-6499-4aac-93c6-7d07e4e98d47\") " pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.187720 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c-config-data\") pod \"barbican-keystone-listener-fc7ffbf98-q7225\" (UID: \"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c\") " pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.187739 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c-combined-ca-bundle\") pod \"barbican-keystone-listener-fc7ffbf98-q7225\" (UID: \"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c\") " pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.187766 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f08b26d-6499-4aac-93c6-7d07e4e98d47-config-data-custom\") pod \"barbican-worker-c895b6649-pwwkn\" (UID: \"4f08b26d-6499-4aac-93c6-7d07e4e98d47\") " pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.198507 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-8455698448-49hhr"] Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.199757 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.201014 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f08b26d-6499-4aac-93c6-7d07e4e98d47-logs\") pod \"barbican-worker-c895b6649-pwwkn\" (UID: \"4f08b26d-6499-4aac-93c6-7d07e4e98d47\") " pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.217527 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c-combined-ca-bundle\") pod \"barbican-keystone-listener-fc7ffbf98-q7225\" (UID: \"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c\") " pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.224947 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c-config-data-custom\") pod \"barbican-keystone-listener-fc7ffbf98-q7225\" (UID: \"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c\") " pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.225762 4753 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"barbican-api-config-data" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.226098 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c-logs\") pod \"barbican-keystone-listener-fc7ffbf98-q7225\" (UID: \"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c\") " pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.237727 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f08b26d-6499-4aac-93c6-7d07e4e98d47-config-data-custom\") pod \"barbican-worker-c895b6649-pwwkn\" (UID: \"4f08b26d-6499-4aac-93c6-7d07e4e98d47\") " pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.242171 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f08b26d-6499-4aac-93c6-7d07e4e98d47-config-data\") pod \"barbican-worker-c895b6649-pwwkn\" (UID: \"4f08b26d-6499-4aac-93c6-7d07e4e98d47\") " pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.243055 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f08b26d-6499-4aac-93c6-7d07e4e98d47-combined-ca-bundle\") pod \"barbican-worker-c895b6649-pwwkn\" (UID: \"4f08b26d-6499-4aac-93c6-7d07e4e98d47\") " pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.245093 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c-config-data\") pod \"barbican-keystone-listener-fc7ffbf98-q7225\" (UID: \"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c\") " pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" 
Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.251854 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkfff\" (UniqueName: \"kubernetes.io/projected/1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c-kube-api-access-nkfff\") pod \"barbican-keystone-listener-fc7ffbf98-q7225\" (UID: \"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c\") " pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.260406 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8455698448-49hhr"] Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.303877 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb644bd75-f6958\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") " pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.304102 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-config\") pod \"dnsmasq-dns-5bb644bd75-f6958\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") " pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.304131 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-dns-svc\") pod \"dnsmasq-dns-5bb644bd75-f6958\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") " pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.304180 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-config-data-custom\") pod \"barbican-api-8455698448-49hhr\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.304196 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-combined-ca-bundle\") pod \"barbican-api-8455698448-49hhr\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.304224 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb644bd75-f6958\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") " pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.304266 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f0c72a4-2cb1-44b2-af49-fe848f359173-logs\") pod \"barbican-api-8455698448-49hhr\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.304294 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-config-data\") pod \"barbican-api-8455698448-49hhr\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.304330 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wq94s\" (UniqueName: \"kubernetes.io/projected/3f0c72a4-2cb1-44b2-af49-fe848f359173-kube-api-access-wq94s\") pod \"barbican-api-8455698448-49hhr\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.304348 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg79m\" (UniqueName: \"kubernetes.io/projected/2de204c7-6e5d-4369-abaf-139ec0d2edcb-kube-api-access-mg79m\") pod \"dnsmasq-dns-5bb644bd75-f6958\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") " pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.304892 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpb2d\" (UniqueName: \"kubernetes.io/projected/4f08b26d-6499-4aac-93c6-7d07e4e98d47-kube-api-access-jpb2d\") pod \"barbican-worker-c895b6649-pwwkn\" (UID: \"4f08b26d-6499-4aac-93c6-7d07e4e98d47\") " pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.325750 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.361873 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.423039 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-dns-svc\") pod \"dnsmasq-dns-5bb644bd75-f6958\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") " pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.423129 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-config-data-custom\") pod \"barbican-api-8455698448-49hhr\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.423164 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-combined-ca-bundle\") pod \"barbican-api-8455698448-49hhr\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.423351 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb644bd75-f6958\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") " pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.423453 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f0c72a4-2cb1-44b2-af49-fe848f359173-logs\") pod \"barbican-api-8455698448-49hhr\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " 
pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.423509 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-config-data\") pod \"barbican-api-8455698448-49hhr\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.423565 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq94s\" (UniqueName: \"kubernetes.io/projected/3f0c72a4-2cb1-44b2-af49-fe848f359173-kube-api-access-wq94s\") pod \"barbican-api-8455698448-49hhr\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.423582 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg79m\" (UniqueName: \"kubernetes.io/projected/2de204c7-6e5d-4369-abaf-139ec0d2edcb-kube-api-access-mg79m\") pod \"dnsmasq-dns-5bb644bd75-f6958\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") " pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.423627 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb644bd75-f6958\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") " pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.423658 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-config\") pod \"dnsmasq-dns-5bb644bd75-f6958\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") " pod="openstack/dnsmasq-dns-5bb644bd75-f6958" 
Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.426756 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-config\") pod \"dnsmasq-dns-5bb644bd75-f6958\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") " pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.426824 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb644bd75-f6958\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") " pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.427269 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb644bd75-f6958\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") " pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.427327 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f0c72a4-2cb1-44b2-af49-fe848f359173-logs\") pod \"barbican-api-8455698448-49hhr\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.437164 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-combined-ca-bundle\") pod \"barbican-api-8455698448-49hhr\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.442158 4753 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-dns-svc\") pod \"dnsmasq-dns-5bb644bd75-f6958\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") " pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.448172 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-config-data-custom\") pod \"barbican-api-8455698448-49hhr\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.460013 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-config-data\") pod \"barbican-api-8455698448-49hhr\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.461951 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg79m\" (UniqueName: \"kubernetes.io/projected/2de204c7-6e5d-4369-abaf-139ec0d2edcb-kube-api-access-mg79m\") pod \"dnsmasq-dns-5bb644bd75-f6958\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") " pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.465384 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq94s\" (UniqueName: \"kubernetes.io/projected/3f0c72a4-2cb1-44b2-af49-fe848f359173-kube-api-access-wq94s\") pod \"barbican-api-8455698448-49hhr\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.486061 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.595945 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-c895b6649-pwwkn" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.668378 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.759074 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.810371 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" event={"ID":"333d3641-348f-40c7-b935-2a37d4269cb5","Type":"ContainerStarted","Data":"ac06d97b9eb76c260141a546c831dcd8214434fc2c19be99689008941a6ca94d"} Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.813501 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c6a9b992-43bc-4727-a6a6-f4280c356ebb","Type":"ContainerStarted","Data":"1fde8540ec9f4a22aa4e7d9b3a084a3498970c3e510fc1a9b56a524126569393"} Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.814999 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ae2325dc-1d53-4605-84d9-c5a341d6c311","Type":"ContainerStarted","Data":"491a1c9321e2d095a9d1263a5040367a95db02a4828d7934f869a3718a4ec3c6"} Oct 05 20:31:57 crc kubenswrapper[4753]: I1005 20:31:57.864308 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 05 20:31:58 crc kubenswrapper[4753]: I1005 20:31:58.072327 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-fc7ffbf98-q7225"] Oct 05 20:31:58 crc kubenswrapper[4753]: W1005 20:31:58.086266 4753 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d4e3a5d_5755_47cc_82af_1e28c4d1cb9c.slice/crio-d55d51a659500ec93a0033c92a4359d787794cc28012ba26ae95c52de0c692df WatchSource:0}: Error finding container d55d51a659500ec93a0033c92a4359d787794cc28012ba26ae95c52de0c692df: Status 404 returned error can't find the container with id d55d51a659500ec93a0033c92a4359d787794cc28012ba26ae95c52de0c692df Oct 05 20:31:58 crc kubenswrapper[4753]: I1005 20:31:58.270508 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-c895b6649-pwwkn"] Oct 05 20:31:58 crc kubenswrapper[4753]: I1005 20:31:58.306014 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8455698448-49hhr"] Oct 05 20:31:58 crc kubenswrapper[4753]: I1005 20:31:58.546644 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bb644bd75-f6958"] Oct 05 20:31:58 crc kubenswrapper[4753]: W1005 20:31:58.652165 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f08b26d_6499_4aac_93c6_7d07e4e98d47.slice/crio-38fb500c4237cb6f24a19ddb748085aeb0c6484fd625c0c72159978a477f1357 WatchSource:0}: Error finding container 38fb500c4237cb6f24a19ddb748085aeb0c6484fd625c0c72159978a477f1357: Status 404 returned error can't find the container with id 38fb500c4237cb6f24a19ddb748085aeb0c6484fd625c0c72159978a477f1357 Oct 05 20:31:58 crc kubenswrapper[4753]: I1005 20:31:58.845647 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a","Type":"ContainerStarted","Data":"2df06b42e3eafbce6dfc9a8f2c12841f580ceba5a89086c14b309aff4d48dd8e"} Oct 05 20:31:58 crc kubenswrapper[4753]: I1005 20:31:58.848231 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 05 20:31:58 crc kubenswrapper[4753]: I1005 20:31:58.868606 4753 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-5bb644bd75-f6958" event={"ID":"2de204c7-6e5d-4369-abaf-139ec0d2edcb","Type":"ContainerStarted","Data":"b003bf45b4713f12c43fd3abd531570474ef3b288d9ffeb4e8a3e924684029df"} Oct 05 20:31:58 crc kubenswrapper[4753]: I1005 20:31:58.889673 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8455698448-49hhr" event={"ID":"3f0c72a4-2cb1-44b2-af49-fe848f359173","Type":"ContainerStarted","Data":"3952eb0b7b892b75e28c3e5ef385c09aec82a4cad4e28ed3e42cd4390e866324"} Oct 05 20:31:58 crc kubenswrapper[4753]: I1005 20:31:58.893595 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c895b6649-pwwkn" event={"ID":"4f08b26d-6499-4aac-93c6-7d07e4e98d47","Type":"ContainerStarted","Data":"38fb500c4237cb6f24a19ddb748085aeb0c6484fd625c0c72159978a477f1357"} Oct 05 20:31:58 crc kubenswrapper[4753]: I1005 20:31:58.915488 4753 generic.go:334] "Generic (PLEG): container finished" podID="333d3641-348f-40c7-b935-2a37d4269cb5" containerID="c7ee155e58d3015ffd9fffdada3ca1496c84f193a9b51395805d675c0606bfcf" exitCode=0 Oct 05 20:31:58 crc kubenswrapper[4753]: I1005 20:31:58.915552 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" event={"ID":"333d3641-348f-40c7-b935-2a37d4269cb5","Type":"ContainerDied","Data":"c7ee155e58d3015ffd9fffdada3ca1496c84f193a9b51395805d675c0606bfcf"} Oct 05 20:31:58 crc kubenswrapper[4753]: I1005 20:31:58.922592 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" event={"ID":"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c","Type":"ContainerStarted","Data":"d55d51a659500ec93a0033c92a4359d787794cc28012ba26ae95c52de0c692df"} Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.656456 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.807425 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-ovsdbserver-nb\") pod \"333d3641-348f-40c7-b935-2a37d4269cb5\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.807751 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-config\") pod \"333d3641-348f-40c7-b935-2a37d4269cb5\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.807769 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh2tm\" (UniqueName: \"kubernetes.io/projected/333d3641-348f-40c7-b935-2a37d4269cb5-kube-api-access-xh2tm\") pod \"333d3641-348f-40c7-b935-2a37d4269cb5\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.807844 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-ovsdbserver-sb\") pod \"333d3641-348f-40c7-b935-2a37d4269cb5\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.807989 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-dns-svc\") pod \"333d3641-348f-40c7-b935-2a37d4269cb5\" (UID: \"333d3641-348f-40c7-b935-2a37d4269cb5\") " Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.815317 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/333d3641-348f-40c7-b935-2a37d4269cb5-kube-api-access-xh2tm" (OuterVolumeSpecName: "kube-api-access-xh2tm") pod "333d3641-348f-40c7-b935-2a37d4269cb5" (UID: "333d3641-348f-40c7-b935-2a37d4269cb5"). InnerVolumeSpecName "kube-api-access-xh2tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.841338 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "333d3641-348f-40c7-b935-2a37d4269cb5" (UID: "333d3641-348f-40c7-b935-2a37d4269cb5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.846573 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-config" (OuterVolumeSpecName: "config") pod "333d3641-348f-40c7-b935-2a37d4269cb5" (UID: "333d3641-348f-40c7-b935-2a37d4269cb5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.848340 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "333d3641-348f-40c7-b935-2a37d4269cb5" (UID: "333d3641-348f-40c7-b935-2a37d4269cb5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.852990 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "333d3641-348f-40c7-b935-2a37d4269cb5" (UID: "333d3641-348f-40c7-b935-2a37d4269cb5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.911275 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.911304 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.911315 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.911324 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh2tm\" (UniqueName: \"kubernetes.io/projected/333d3641-348f-40c7-b935-2a37d4269cb5-kube-api-access-xh2tm\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.911333 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/333d3641-348f-40c7-b935-2a37d4269cb5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.938345 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a","Type":"ContainerStarted","Data":"77725fb0fb2cf7ac2bc52b40634d9dcac52ef7fb0bf45720fbd037750a395a8d"} Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.944330 4753 generic.go:334] "Generic (PLEG): container finished" podID="2de204c7-6e5d-4369-abaf-139ec0d2edcb" containerID="c0075795fadfaeabc620ae48fbec5fe8534a9f75d7f97f3d13bbb9585d9a2531" exitCode=0 Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.944569 
4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb644bd75-f6958" event={"ID":"2de204c7-6e5d-4369-abaf-139ec0d2edcb","Type":"ContainerDied","Data":"c0075795fadfaeabc620ae48fbec5fe8534a9f75d7f97f3d13bbb9585d9a2531"} Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.957366 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8455698448-49hhr" event={"ID":"3f0c72a4-2cb1-44b2-af49-fe848f359173","Type":"ContainerStarted","Data":"fcf652788b5e95e4ccb28fe3d3cc7180e1d84b9464809d24dc512cc048955f05"} Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.957433 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8455698448-49hhr" event={"ID":"3f0c72a4-2cb1-44b2-af49-fe848f359173","Type":"ContainerStarted","Data":"4cc0e25e397e5bea43cd9883b34ccd756d781f0bddb78c949b1e14d8d9b2ac8f"} Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.958253 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.958282 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.962225 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" event={"ID":"333d3641-348f-40c7-b935-2a37d4269cb5","Type":"ContainerDied","Data":"ac06d97b9eb76c260141a546c831dcd8214434fc2c19be99689008941a6ca94d"} Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.962270 4753 scope.go:117] "RemoveContainer" containerID="c7ee155e58d3015ffd9fffdada3ca1496c84f193a9b51395805d675c0606bfcf" Oct 05 20:31:59 crc kubenswrapper[4753]: I1005 20:31:59.962414 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfb658c9c-tbrg7" Oct 05 20:32:00 crc kubenswrapper[4753]: I1005 20:32:00.053203 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-8455698448-49hhr" podStartSLOduration=3.053183933 podStartE2EDuration="3.053183933s" podCreationTimestamp="2025-10-05 20:31:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:32:00.011258632 +0000 UTC m=+1028.859586864" watchObservedRunningTime="2025-10-05 20:32:00.053183933 +0000 UTC m=+1028.901512165" Oct 05 20:32:00 crc kubenswrapper[4753]: I1005 20:32:00.171218 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfb658c9c-tbrg7"] Oct 05 20:32:00 crc kubenswrapper[4753]: I1005 20:32:00.195851 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cfb658c9c-tbrg7"] Oct 05 20:32:00 crc kubenswrapper[4753]: I1005 20:32:00.979609 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb644bd75-f6958" event={"ID":"2de204c7-6e5d-4369-abaf-139ec0d2edcb","Type":"ContainerStarted","Data":"0ce7dac14ce3c0fb4bc4337c1689abe41e7ddea7d1244b11d052605e935a8ee6"} Oct 05 20:32:00 crc kubenswrapper[4753]: I1005 20:32:00.983588 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bb644bd75-f6958" Oct 05 20:32:00 crc kubenswrapper[4753]: I1005 20:32:00.990473 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c6a9b992-43bc-4727-a6a6-f4280c356ebb","Type":"ContainerStarted","Data":"6c429f1d58b05d5cbe0489b8d150c2783d949106e3af44ee9ac35bb2234ce719"} Oct 05 20:32:00 crc kubenswrapper[4753]: I1005 20:32:00.990524 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"c6a9b992-43bc-4727-a6a6-f4280c356ebb","Type":"ContainerStarted","Data":"bd053a9c699f6bb9c73a8e5ad3581056fd480ba9e61ee35780dc5e8b553f7f7f"} Oct 05 20:32:01 crc kubenswrapper[4753]: I1005 20:32:01.009806 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a","Type":"ContainerStarted","Data":"0c02f0715a73a774577278f2cf9af51c08069081e6967bf3ddd0ce1be9265003"} Oct 05 20:32:01 crc kubenswrapper[4753]: I1005 20:32:01.009860 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" containerName="cinder-api-log" containerID="cri-o://77725fb0fb2cf7ac2bc52b40634d9dcac52ef7fb0bf45720fbd037750a395a8d" gracePeriod=30 Oct 05 20:32:01 crc kubenswrapper[4753]: I1005 20:32:01.009881 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 05 20:32:01 crc kubenswrapper[4753]: I1005 20:32:01.009914 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" containerName="cinder-api" containerID="cri-o://0c02f0715a73a774577278f2cf9af51c08069081e6967bf3ddd0ce1be9265003" gracePeriod=30 Oct 05 20:32:01 crc kubenswrapper[4753]: I1005 20:32:01.013458 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bb644bd75-f6958" podStartSLOduration=4.01343734 podStartE2EDuration="4.01343734s" podCreationTimestamp="2025-10-05 20:31:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:32:01.008693243 +0000 UTC m=+1029.857021475" watchObservedRunningTime="2025-10-05 20:32:01.01343734 +0000 UTC m=+1029.861765582" Oct 05 20:32:01 crc kubenswrapper[4753]: I1005 20:32:01.047617 4753 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.624389042 podStartE2EDuration="6.047600008s" podCreationTimestamp="2025-10-05 20:31:55 +0000 UTC" firstStartedPulling="2025-10-05 20:31:57.347329974 +0000 UTC m=+1026.195658206" lastFinishedPulling="2025-10-05 20:31:58.77054094 +0000 UTC m=+1027.618869172" observedRunningTime="2025-10-05 20:32:01.042735557 +0000 UTC m=+1029.891063789" watchObservedRunningTime="2025-10-05 20:32:01.047600008 +0000 UTC m=+1029.895928240" Oct 05 20:32:01 crc kubenswrapper[4753]: I1005 20:32:01.075852 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.075838561 podStartE2EDuration="5.075838561s" podCreationTimestamp="2025-10-05 20:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:32:01.073912611 +0000 UTC m=+1029.922240843" watchObservedRunningTime="2025-10-05 20:32:01.075838561 +0000 UTC m=+1029.924166793" Oct 05 20:32:01 crc kubenswrapper[4753]: I1005 20:32:01.536021 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 05 20:32:01 crc kubenswrapper[4753]: I1005 20:32:01.866345 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="333d3641-348f-40c7-b935-2a37d4269cb5" path="/var/lib/kubelet/pods/333d3641-348f-40c7-b935-2a37d4269cb5/volumes" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.021468 4753 generic.go:334] "Generic (PLEG): container finished" podID="a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" containerID="0c02f0715a73a774577278f2cf9af51c08069081e6967bf3ddd0ce1be9265003" exitCode=0 Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.021743 4753 generic.go:334] "Generic (PLEG): container finished" podID="a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" containerID="77725fb0fb2cf7ac2bc52b40634d9dcac52ef7fb0bf45720fbd037750a395a8d" exitCode=143 Oct 05 20:32:02 
crc kubenswrapper[4753]: I1005 20:32:02.021911 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a","Type":"ContainerDied","Data":"0c02f0715a73a774577278f2cf9af51c08069081e6967bf3ddd0ce1be9265003"} Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.021936 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a","Type":"ContainerDied","Data":"77725fb0fb2cf7ac2bc52b40634d9dcac52ef7fb0bf45720fbd037750a395a8d"} Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.537875 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-568f5b5b96-6t6qd"] Oct 05 20:32:02 crc kubenswrapper[4753]: E1005 20:32:02.538207 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333d3641-348f-40c7-b935-2a37d4269cb5" containerName="init" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.538224 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="333d3641-348f-40c7-b935-2a37d4269cb5" containerName="init" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.538375 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="333d3641-348f-40c7-b935-2a37d4269cb5" containerName="init" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.539259 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.543782 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.551997 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.553684 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-568f5b5b96-6t6qd"] Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.688445 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75308018-c2d6-42ef-9776-f3b861ec86ed-config-data\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.688781 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75308018-c2d6-42ef-9776-f3b861ec86ed-config-data-custom\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.688867 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75308018-c2d6-42ef-9776-f3b861ec86ed-combined-ca-bundle\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.688900 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/75308018-c2d6-42ef-9776-f3b861ec86ed-internal-tls-certs\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.688965 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75308018-c2d6-42ef-9776-f3b861ec86ed-logs\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.689035 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzsxn\" (UniqueName: \"kubernetes.io/projected/75308018-c2d6-42ef-9776-f3b861ec86ed-kube-api-access-tzsxn\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.689121 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75308018-c2d6-42ef-9776-f3b861ec86ed-public-tls-certs\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.747002 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.791382 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75308018-c2d6-42ef-9776-f3b861ec86ed-config-data-custom\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.791446 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75308018-c2d6-42ef-9776-f3b861ec86ed-combined-ca-bundle\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.791472 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75308018-c2d6-42ef-9776-f3b861ec86ed-internal-tls-certs\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.791506 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75308018-c2d6-42ef-9776-f3b861ec86ed-logs\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.791547 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzsxn\" (UniqueName: \"kubernetes.io/projected/75308018-c2d6-42ef-9776-f3b861ec86ed-kube-api-access-tzsxn\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " 
pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.791587 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75308018-c2d6-42ef-9776-f3b861ec86ed-public-tls-certs\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.791622 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75308018-c2d6-42ef-9776-f3b861ec86ed-config-data\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.792810 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75308018-c2d6-42ef-9776-f3b861ec86ed-logs\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.802974 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75308018-c2d6-42ef-9776-f3b861ec86ed-internal-tls-certs\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.803332 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75308018-c2d6-42ef-9776-f3b861ec86ed-combined-ca-bundle\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: 
I1005 20:32:02.804348 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75308018-c2d6-42ef-9776-f3b861ec86ed-config-data\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.804848 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75308018-c2d6-42ef-9776-f3b861ec86ed-public-tls-certs\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.814621 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75308018-c2d6-42ef-9776-f3b861ec86ed-config-data-custom\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.818795 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzsxn\" (UniqueName: \"kubernetes.io/projected/75308018-c2d6-42ef-9776-f3b861ec86ed-kube-api-access-tzsxn\") pod \"barbican-api-568f5b5b96-6t6qd\" (UID: \"75308018-c2d6-42ef-9776-f3b861ec86ed\") " pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.871567 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-568f5b5b96-6t6qd" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.893165 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-combined-ca-bundle\") pod \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.893221 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-logs\") pod \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.893268 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-config-data-custom\") pod \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.893294 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-etc-machine-id\") pod \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.893350 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-scripts\") pod \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.893372 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt92v\" (UniqueName: 
\"kubernetes.io/projected/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-kube-api-access-lt92v\") pod \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.893448 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-config-data\") pod \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\" (UID: \"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a\") " Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.898001 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" (UID: "a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.898847 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-logs" (OuterVolumeSpecName: "logs") pod "a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" (UID: "a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.907927 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" (UID: "a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.909012 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-kube-api-access-lt92v" (OuterVolumeSpecName: "kube-api-access-lt92v") pod "a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" (UID: "a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a"). InnerVolumeSpecName "kube-api-access-lt92v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.923551 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-scripts" (OuterVolumeSpecName: "scripts") pod "a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" (UID: "a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.980217 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" (UID: "a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.995679 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.995938 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-logs\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.996008 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.996065 4753 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.996118 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:02 crc kubenswrapper[4753]: I1005 20:32:02.996197 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt92v\" (UniqueName: \"kubernetes.io/projected/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-kube-api-access-lt92v\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.021867 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-config-data" (OuterVolumeSpecName: "config-data") pod "a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" (UID: "a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.066773 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c895b6649-pwwkn" event={"ID":"4f08b26d-6499-4aac-93c6-7d07e4e98d47","Type":"ContainerStarted","Data":"5daaa1808679c7044925b27ca63656138c72f7bdb34a4d34200b25aaaa66cad7"} Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.084221 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.090777 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a","Type":"ContainerDied","Data":"2df06b42e3eafbce6dfc9a8f2c12841f580ceba5a89086c14b309aff4d48dd8e"} Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.090859 4753 scope.go:117] "RemoveContainer" containerID="0c02f0715a73a774577278f2cf9af51c08069081e6967bf3ddd0ce1be9265003" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.097503 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.155283 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.169534 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.179661 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 05 20:32:03 crc kubenswrapper[4753]: E1005 20:32:03.180389 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" containerName="cinder-api-log" Oct 05 20:32:03 crc kubenswrapper[4753]: 
I1005 20:32:03.180407 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" containerName="cinder-api-log" Oct 05 20:32:03 crc kubenswrapper[4753]: E1005 20:32:03.180428 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" containerName="cinder-api" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.180434 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" containerName="cinder-api" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.180583 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" containerName="cinder-api" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.180596 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" containerName="cinder-api-log" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.181447 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.182316 4753 scope.go:117] "RemoveContainer" containerID="77725fb0fb2cf7ac2bc52b40634d9dcac52ef7fb0bf45720fbd037750a395a8d" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.184515 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.185062 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.185256 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.192690 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.302540 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-scripts\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.302576 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2c30bc-68b4-4803-852e-b12fe770196d-logs\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.302624 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0" Oct 05 20:32:03 crc 
kubenswrapper[4753]: I1005 20:32:03.302650 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba2c30bc-68b4-4803-852e-b12fe770196d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.302680 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-config-data\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.302709 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.302732 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-config-data-custom\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.302752 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0" Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.302773 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-895jm\" (UniqueName: \"kubernetes.io/projected/ba2c30bc-68b4-4803-852e-b12fe770196d-kube-api-access-895jm\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.404860 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2c30bc-68b4-4803-852e-b12fe770196d-logs\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.404945 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.404986 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba2c30bc-68b4-4803-852e-b12fe770196d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.405028 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-config-data\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.405070 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.405100 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-config-data-custom\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.405132 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.405234 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-895jm\" (UniqueName: \"kubernetes.io/projected/ba2c30bc-68b4-4803-852e-b12fe770196d-kube-api-access-895jm\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.405283 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-scripts\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.407108 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba2c30bc-68b4-4803-852e-b12fe770196d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.407774 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2c30bc-68b4-4803-852e-b12fe770196d-logs\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.410745 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-config-data-custom\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.441790 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.441849 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.442336 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-scripts\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.442579 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-config-data\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.445100 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-895jm\" (UniqueName: \"kubernetes.io/projected/ba2c30bc-68b4-4803-852e-b12fe770196d-kube-api-access-895jm\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.449569 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-568f5b5b96-6t6qd"]
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.450616 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba2c30bc-68b4-4803-852e-b12fe770196d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ba2c30bc-68b4-4803-852e-b12fe770196d\") " pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.501208 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 05 20:32:03 crc kubenswrapper[4753]: I1005 20:32:03.863014 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a" path="/var/lib/kubelet/pods/a5c292b4-5ee5-4f35-bd01-9cb62ec4ae2a/volumes"
Oct 05 20:32:04 crc kubenswrapper[4753]: I1005 20:32:04.029671 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 05 20:32:04 crc kubenswrapper[4753]: W1005 20:32:04.041809 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba2c30bc_68b4_4803_852e_b12fe770196d.slice/crio-ebe34987e0c44e8e4c23aa82375510930cbb6d0dba6d57c1d1b68bcd42723f05 WatchSource:0}: Error finding container ebe34987e0c44e8e4c23aa82375510930cbb6d0dba6d57c1d1b68bcd42723f05: Status 404 returned error can't find the container with id ebe34987e0c44e8e4c23aa82375510930cbb6d0dba6d57c1d1b68bcd42723f05
Oct 05 20:32:04 crc kubenswrapper[4753]: I1005 20:32:04.096233 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c895b6649-pwwkn" event={"ID":"4f08b26d-6499-4aac-93c6-7d07e4e98d47","Type":"ContainerStarted","Data":"dedb30bfbea4154a48440186e57f331519846a609b3f1bad22f66636463c712b"}
Oct 05 20:32:04 crc kubenswrapper[4753]: I1005 20:32:04.105913 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-568f5b5b96-6t6qd" event={"ID":"75308018-c2d6-42ef-9776-f3b861ec86ed","Type":"ContainerStarted","Data":"88c5874b652c92035d726f63ab8aa04a0c8360de0f08d4ce48965ed3ddfe5eb4"}
Oct 05 20:32:04 crc kubenswrapper[4753]: I1005 20:32:04.105977 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-568f5b5b96-6t6qd" event={"ID":"75308018-c2d6-42ef-9776-f3b861ec86ed","Type":"ContainerStarted","Data":"af4f541727c21e12ccc6d0f7d8391a0b4e92778d3c2c05f626fd2b2438f0a6fc"}
Oct 05 20:32:04 crc kubenswrapper[4753]: I1005 20:32:04.105991 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-568f5b5b96-6t6qd" event={"ID":"75308018-c2d6-42ef-9776-f3b861ec86ed","Type":"ContainerStarted","Data":"22e0ee3947b39fdf52818fa33372aa7dd9f2497678c61da4dda3f4a460bfec73"}
Oct 05 20:32:04 crc kubenswrapper[4753]: I1005 20:32:04.106037 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-568f5b5b96-6t6qd"
Oct 05 20:32:04 crc kubenswrapper[4753]: I1005 20:32:04.106079 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-568f5b5b96-6t6qd"
Oct 05 20:32:04 crc kubenswrapper[4753]: I1005 20:32:04.109267 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" event={"ID":"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c","Type":"ContainerStarted","Data":"7e5cef77562e49707ca3cf6cd921ed6b962b64720d865c1fa0f0cc26a2bbbd90"}
Oct 05 20:32:04 crc kubenswrapper[4753]: I1005 20:32:04.109310 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" event={"ID":"1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c","Type":"ContainerStarted","Data":"6aa50dfc14c12fb83213fa9ed36d07813b7d2396fa191838cdc669b33e657365"}
Oct 05 20:32:04 crc kubenswrapper[4753]: I1005 20:32:04.122113 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-c895b6649-pwwkn" podStartSLOduration=4.117811193 podStartE2EDuration="8.122093416s" podCreationTimestamp="2025-10-05 20:31:56 +0000 UTC" firstStartedPulling="2025-10-05 20:31:58.654609928 +0000 UTC m=+1027.502938160" lastFinishedPulling="2025-10-05 20:32:02.658892161 +0000 UTC m=+1031.507220383" observedRunningTime="2025-10-05 20:32:04.120079644 +0000 UTC m=+1032.968407876" watchObservedRunningTime="2025-10-05 20:32:04.122093416 +0000 UTC m=+1032.970421648"
Oct 05 20:32:04 crc kubenswrapper[4753]: I1005 20:32:04.126720 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ba2c30bc-68b4-4803-852e-b12fe770196d","Type":"ContainerStarted","Data":"ebe34987e0c44e8e4c23aa82375510930cbb6d0dba6d57c1d1b68bcd42723f05"}
Oct 05 20:32:04 crc kubenswrapper[4753]: I1005 20:32:04.146815 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-fc7ffbf98-q7225" podStartSLOduration=3.491338845 podStartE2EDuration="8.146791789s" podCreationTimestamp="2025-10-05 20:31:56 +0000 UTC" firstStartedPulling="2025-10-05 20:31:58.123291073 +0000 UTC m=+1026.971619305" lastFinishedPulling="2025-10-05 20:32:02.778744017 +0000 UTC m=+1031.627072249" observedRunningTime="2025-10-05 20:32:04.139550002 +0000 UTC m=+1032.987878234" watchObservedRunningTime="2025-10-05 20:32:04.146791789 +0000 UTC m=+1032.995120021"
Oct 05 20:32:04 crc kubenswrapper[4753]: I1005 20:32:04.166398 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-568f5b5b96-6t6qd" podStartSLOduration=2.16637637 podStartE2EDuration="2.16637637s" podCreationTimestamp="2025-10-05 20:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:32:04.157635758 +0000 UTC m=+1033.005964400" watchObservedRunningTime="2025-10-05 20:32:04.16637637 +0000 UTC m=+1033.014704602"
Oct 05 20:32:05 crc kubenswrapper[4753]: I1005 20:32:05.141521 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ba2c30bc-68b4-4803-852e-b12fe770196d","Type":"ContainerStarted","Data":"1d97c25a22a4e20a6fccce80d28c65b5b6049d48768199b0228dbd841cec76de"}
Oct 05 20:32:06 crc kubenswrapper[4753]: I1005 20:32:06.152955 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ba2c30bc-68b4-4803-852e-b12fe770196d","Type":"ContainerStarted","Data":"fad82c6a5bef567e9176d1e4b25733cea814fd7be4c24cb51f7d1da63131677e"}
Oct 05 20:32:06 crc kubenswrapper[4753]: I1005 20:32:06.153368 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Oct 05 20:32:06 crc kubenswrapper[4753]: I1005 20:32:06.172469 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.172452641 podStartE2EDuration="3.172452641s" podCreationTimestamp="2025-10-05 20:32:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:32:06.168125775 +0000 UTC m=+1035.016454007" watchObservedRunningTime="2025-10-05 20:32:06.172452641 +0000 UTC m=+1035.020780863"
Oct 05 20:32:06 crc kubenswrapper[4753]: I1005 20:32:06.784419 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 05 20:32:06 crc kubenswrapper[4753]: I1005 20:32:06.835785 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 05 20:32:07 crc kubenswrapper[4753]: I1005 20:32:07.189862 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c6a9b992-43bc-4727-a6a6-f4280c356ebb" containerName="cinder-scheduler" containerID="cri-o://bd053a9c699f6bb9c73a8e5ad3581056fd480ba9e61ee35780dc5e8b553f7f7f" gracePeriod=30
Oct 05 20:32:07 crc kubenswrapper[4753]: I1005 20:32:07.190368 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c6a9b992-43bc-4727-a6a6-f4280c356ebb" containerName="probe" containerID="cri-o://6c429f1d58b05d5cbe0489b8d150c2783d949106e3af44ee9ac35bb2234ce719" gracePeriod=30
Oct 05 20:32:07 crc kubenswrapper[4753]: I1005 20:32:07.470400 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-866d77d954-p4qfh"
Oct 05 20:32:07 crc kubenswrapper[4753]: I1005 20:32:07.762416 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bb644bd75-f6958"
Oct 05 20:32:07 crc kubenswrapper[4753]: I1005 20:32:07.869123 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f69d7557-flmx9"]
Oct 05 20:32:07 crc kubenswrapper[4753]: I1005 20:32:07.869347 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f69d7557-flmx9" podUID="1cc6a54b-8439-4745-8b04-98f4d28d06b4" containerName="dnsmasq-dns" containerID="cri-o://707aaa5017e0cbe44a45fd815e9789c08c47f75e9bba9299091ae1d5d88916bd" gracePeriod=10
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.210759 4753 generic.go:334] "Generic (PLEG): container finished" podID="1cc6a54b-8439-4745-8b04-98f4d28d06b4" containerID="707aaa5017e0cbe44a45fd815e9789c08c47f75e9bba9299091ae1d5d88916bd" exitCode=0
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.210814 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f69d7557-flmx9" event={"ID":"1cc6a54b-8439-4745-8b04-98f4d28d06b4","Type":"ContainerDied","Data":"707aaa5017e0cbe44a45fd815e9789c08c47f75e9bba9299091ae1d5d88916bd"}
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.610391 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f69d7557-flmx9"
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.787719 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-config\") pod \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") "
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.787823 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-dns-svc\") pod \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") "
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.787872 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2hk5\" (UniqueName: \"kubernetes.io/projected/1cc6a54b-8439-4745-8b04-98f4d28d06b4-kube-api-access-b2hk5\") pod \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") "
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.788106 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-ovsdbserver-nb\") pod \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") "
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.788154 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-ovsdbserver-sb\") pod \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\" (UID: \"1cc6a54b-8439-4745-8b04-98f4d28d06b4\") "
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.797633 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cc6a54b-8439-4745-8b04-98f4d28d06b4-kube-api-access-b2hk5" (OuterVolumeSpecName: "kube-api-access-b2hk5") pod "1cc6a54b-8439-4745-8b04-98f4d28d06b4" (UID: "1cc6a54b-8439-4745-8b04-98f4d28d06b4"). InnerVolumeSpecName "kube-api-access-b2hk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.870465 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-config" (OuterVolumeSpecName: "config") pod "1cc6a54b-8439-4745-8b04-98f4d28d06b4" (UID: "1cc6a54b-8439-4745-8b04-98f4d28d06b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.889322 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1cc6a54b-8439-4745-8b04-98f4d28d06b4" (UID: "1cc6a54b-8439-4745-8b04-98f4d28d06b4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.896348 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.896385 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-config\") on node \"crc\" DevicePath \"\""
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.896398 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2hk5\" (UniqueName: \"kubernetes.io/projected/1cc6a54b-8439-4745-8b04-98f4d28d06b4-kube-api-access-b2hk5\") on node \"crc\" DevicePath \"\""
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.908973 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1cc6a54b-8439-4745-8b04-98f4d28d06b4" (UID: "1cc6a54b-8439-4745-8b04-98f4d28d06b4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.949743 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1cc6a54b-8439-4745-8b04-98f4d28d06b4" (UID: "1cc6a54b-8439-4745-8b04-98f4d28d06b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.998181 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 05 20:32:08 crc kubenswrapper[4753]: I1005 20:32:08.998396 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cc6a54b-8439-4745-8b04-98f4d28d06b4-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 05 20:32:09 crc kubenswrapper[4753]: I1005 20:32:09.222047 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f69d7557-flmx9" event={"ID":"1cc6a54b-8439-4745-8b04-98f4d28d06b4","Type":"ContainerDied","Data":"fc26f612b471ffd2d7a301679979ccb83ce66d36891bca87dd5e9e0e8a8546db"}
Oct 05 20:32:09 crc kubenswrapper[4753]: I1005 20:32:09.222098 4753 scope.go:117] "RemoveContainer" containerID="707aaa5017e0cbe44a45fd815e9789c08c47f75e9bba9299091ae1d5d88916bd"
Oct 05 20:32:09 crc kubenswrapper[4753]: I1005 20:32:09.222099 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f69d7557-flmx9"
Oct 05 20:32:09 crc kubenswrapper[4753]: I1005 20:32:09.224773 4753 generic.go:334] "Generic (PLEG): container finished" podID="c6a9b992-43bc-4727-a6a6-f4280c356ebb" containerID="6c429f1d58b05d5cbe0489b8d150c2783d949106e3af44ee9ac35bb2234ce719" exitCode=0
Oct 05 20:32:09 crc kubenswrapper[4753]: I1005 20:32:09.224808 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c6a9b992-43bc-4727-a6a6-f4280c356ebb","Type":"ContainerDied","Data":"6c429f1d58b05d5cbe0489b8d150c2783d949106e3af44ee9ac35bb2234ce719"}
Oct 05 20:32:09 crc kubenswrapper[4753]: I1005 20:32:09.264027 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f69d7557-flmx9"]
Oct 05 20:32:09 crc kubenswrapper[4753]: I1005 20:32:09.277488 4753 scope.go:117] "RemoveContainer" containerID="1d3c1c6b7888f801f218df3c50f8626e6c41bea6ec02aa1c20717c14baf0f0b8"
Oct 05 20:32:09 crc kubenswrapper[4753]: I1005 20:32:09.286569 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f69d7557-flmx9"]
Oct 05 20:32:09 crc kubenswrapper[4753]: I1005 20:32:09.861752 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cc6a54b-8439-4745-8b04-98f4d28d06b4" path="/var/lib/kubelet/pods/1cc6a54b-8439-4745-8b04-98f4d28d06b4/volumes"
Oct 05 20:32:10 crc kubenswrapper[4753]: I1005 20:32:10.244852 4753 generic.go:334] "Generic (PLEG): container finished" podID="c6a9b992-43bc-4727-a6a6-f4280c356ebb" containerID="bd053a9c699f6bb9c73a8e5ad3581056fd480ba9e61ee35780dc5e8b553f7f7f" exitCode=0
Oct 05 20:32:10 crc kubenswrapper[4753]: I1005 20:32:10.244907 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c6a9b992-43bc-4727-a6a6-f4280c356ebb","Type":"ContainerDied","Data":"bd053a9c699f6bb9c73a8e5ad3581056fd480ba9e61ee35780dc5e8b553f7f7f"}
Oct 05 20:32:10 crc kubenswrapper[4753]: I1005 20:32:10.438485 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8455698448-49hhr"
Oct 05 20:32:10 crc kubenswrapper[4753]: I1005 20:32:10.446995 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-79cfb6d465-74j5v"
Oct 05 20:32:10 crc kubenswrapper[4753]: I1005 20:32:10.525945 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-866d77d954-p4qfh"]
Oct 05 20:32:10 crc kubenswrapper[4753]: I1005 20:32:10.526229 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-866d77d954-p4qfh" podUID="c507e884-7868-46c1-b89a-e8ee71f3e8e1" containerName="neutron-api" containerID="cri-o://b69237393cc61b3255df8f36afafb6d480e49844867db99dd24af1a55c4a796b" gracePeriod=30
Oct 05 20:32:10 crc kubenswrapper[4753]: I1005 20:32:10.526397 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-866d77d954-p4qfh" podUID="c507e884-7868-46c1-b89a-e8ee71f3e8e1" containerName="neutron-httpd" containerID="cri-o://90f7763ad014d54f29c0606d7396bbc5c87ea44fb26910e2a8c081b7db9f1104" gracePeriod=30
Oct 05 20:32:10 crc kubenswrapper[4753]: I1005 20:32:10.608592 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8455698448-49hhr"
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.260812 4753 generic.go:334] "Generic (PLEG): container finished" podID="c507e884-7868-46c1-b89a-e8ee71f3e8e1" containerID="90f7763ad014d54f29c0606d7396bbc5c87ea44fb26910e2a8c081b7db9f1104" exitCode=0
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.260965 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-866d77d954-p4qfh" event={"ID":"c507e884-7868-46c1-b89a-e8ee71f3e8e1","Type":"ContainerDied","Data":"90f7763ad014d54f29c0606d7396bbc5c87ea44fb26910e2a8c081b7db9f1104"}
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.432198 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-jk77b"]
Oct 05 20:32:11 crc kubenswrapper[4753]: E1005 20:32:11.432507 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc6a54b-8439-4745-8b04-98f4d28d06b4" containerName="init"
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.432523 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc6a54b-8439-4745-8b04-98f4d28d06b4" containerName="init"
Oct 05 20:32:11 crc kubenswrapper[4753]: E1005 20:32:11.432543 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc6a54b-8439-4745-8b04-98f4d28d06b4" containerName="dnsmasq-dns"
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.432550 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc6a54b-8439-4745-8b04-98f4d28d06b4" containerName="dnsmasq-dns"
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.432836 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc6a54b-8439-4745-8b04-98f4d28d06b4" containerName="dnsmasq-dns"
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.441736 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jk77b"
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.452040 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj92s\" (UniqueName: \"kubernetes.io/projected/959512b8-c8e2-41eb-9405-a99df49caf33-kube-api-access-pj92s\") pod \"nova-api-db-create-jk77b\" (UID: \"959512b8-c8e2-41eb-9405-a99df49caf33\") " pod="openstack/nova-api-db-create-jk77b"
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.452896 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jk77b"]
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.553455 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj92s\" (UniqueName: \"kubernetes.io/projected/959512b8-c8e2-41eb-9405-a99df49caf33-kube-api-access-pj92s\") pod \"nova-api-db-create-jk77b\" (UID: \"959512b8-c8e2-41eb-9405-a99df49caf33\") " pod="openstack/nova-api-db-create-jk77b"
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.603982 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj92s\" (UniqueName: \"kubernetes.io/projected/959512b8-c8e2-41eb-9405-a99df49caf33-kube-api-access-pj92s\") pod \"nova-api-db-create-jk77b\" (UID: \"959512b8-c8e2-41eb-9405-a99df49caf33\") " pod="openstack/nova-api-db-create-jk77b"
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.724245 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-rbfbj"]
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.725478 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rbfbj"
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.733827 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rbfbj"]
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.760551 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jk77b"
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.823947 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-4nx88"]
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.825053 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4nx88"
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.837790 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4nx88"]
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.867800 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qslsz\" (UniqueName: \"kubernetes.io/projected/9b80ea4d-52cf-4876-bd3a-436c9e65c93f-kube-api-access-qslsz\") pod \"nova-cell0-db-create-rbfbj\" (UID: \"9b80ea4d-52cf-4876-bd3a-436c9e65c93f\") " pod="openstack/nova-cell0-db-create-rbfbj"
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.969768 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7mbw\" (UniqueName: \"kubernetes.io/projected/a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc-kube-api-access-q7mbw\") pod \"nova-cell1-db-create-4nx88\" (UID: \"a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc\") " pod="openstack/nova-cell1-db-create-4nx88"
Oct 05 20:32:11 crc kubenswrapper[4753]: I1005 20:32:11.970186 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qslsz\" (UniqueName: \"kubernetes.io/projected/9b80ea4d-52cf-4876-bd3a-436c9e65c93f-kube-api-access-qslsz\") pod \"nova-cell0-db-create-rbfbj\" (UID: \"9b80ea4d-52cf-4876-bd3a-436c9e65c93f\") " pod="openstack/nova-cell0-db-create-rbfbj"
Oct 05 20:32:12 crc kubenswrapper[4753]: I1005 20:32:12.001931 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qslsz\" (UniqueName: \"kubernetes.io/projected/9b80ea4d-52cf-4876-bd3a-436c9e65c93f-kube-api-access-qslsz\") pod \"nova-cell0-db-create-rbfbj\" (UID: \"9b80ea4d-52cf-4876-bd3a-436c9e65c93f\") " pod="openstack/nova-cell0-db-create-rbfbj"
Oct 05 20:32:12 crc kubenswrapper[4753]: I1005 20:32:12.071436 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7mbw\" (UniqueName: \"kubernetes.io/projected/a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc-kube-api-access-q7mbw\") pod \"nova-cell1-db-create-4nx88\" (UID: \"a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc\") " pod="openstack/nova-cell1-db-create-4nx88"
Oct 05 20:32:12 crc kubenswrapper[4753]: I1005 20:32:12.074944 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rbfbj"
Oct 05 20:32:12 crc kubenswrapper[4753]: I1005 20:32:12.088474 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7mbw\" (UniqueName: \"kubernetes.io/projected/a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc-kube-api-access-q7mbw\") pod \"nova-cell1-db-create-4nx88\" (UID: \"a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc\") " pod="openstack/nova-cell1-db-create-4nx88"
Oct 05 20:32:12 crc kubenswrapper[4753]: I1005 20:32:12.169032 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4nx88"
Oct 05 20:32:13 crc kubenswrapper[4753]: I1005 20:32:13.849159 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 05 20:32:13 crc kubenswrapper[4753]: I1005 20:32:13.849718 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="ceilometer-central-agent" containerID="cri-o://bc56ade412c71d62989a39d6e00e1ad36d72e3985c60278b3113e7adcd1cee5b" gracePeriod=30
Oct 05 20:32:13 crc kubenswrapper[4753]: I1005 20:32:13.850786 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="sg-core" containerID="cri-o://459a708a60fda760e18b6910009580528f842822d6bbb458960969d25dd88f45" gracePeriod=30
Oct 05 20:32:13 crc kubenswrapper[4753]: I1005 20:32:13.850958 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="proxy-httpd" containerID="cri-o://00c099a79a982ca0152e2f01be1f6f641f4d833e6463861f5fd79273026ef46f" gracePeriod=30
Oct 05 20:32:13 crc kubenswrapper[4753]: I1005 20:32:13.851014 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="ceilometer-notification-agent" containerID="cri-o://81cae57410065c718a3e174a14d772660043a6a78543a94b4ca1e47bd0938c4d" gracePeriod=30
Oct 05 20:32:13 crc kubenswrapper[4753]: I1005 20:32:13.871881 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.147:3000/\": EOF"
Oct 05 20:32:14 crc kubenswrapper[4753]: I1005 20:32:14.315819 4753 generic.go:334] "Generic (PLEG): container finished" podID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerID="00c099a79a982ca0152e2f01be1f6f641f4d833e6463861f5fd79273026ef46f" exitCode=0
Oct 05 20:32:14 crc kubenswrapper[4753]: I1005 20:32:14.315853 4753 generic.go:334] "Generic (PLEG): container finished" podID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerID="459a708a60fda760e18b6910009580528f842822d6bbb458960969d25dd88f45" exitCode=2
Oct 05 20:32:14 crc kubenswrapper[4753]: I1005 20:32:14.315871 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17c5fb23-4894-4cf6-a12c-c698a36c6450","Type":"ContainerDied","Data":"00c099a79a982ca0152e2f01be1f6f641f4d833e6463861f5fd79273026ef46f"}
Oct 05 20:32:14 crc kubenswrapper[4753]: I1005 20:32:14.315896 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17c5fb23-4894-4cf6-a12c-c698a36c6450","Type":"ContainerDied","Data":"459a708a60fda760e18b6910009580528f842822d6bbb458960969d25dd88f45"}
Oct 05 20:32:15 crc kubenswrapper[4753]: I1005 20:32:15.331095 4753 generic.go:334] "Generic (PLEG): container finished" podID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerID="bc56ade412c71d62989a39d6e00e1ad36d72e3985c60278b3113e7adcd1cee5b" exitCode=0
Oct 05 20:32:15 crc kubenswrapper[4753]: I1005 20:32:15.331177 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17c5fb23-4894-4cf6-a12c-c698a36c6450","Type":"ContainerDied","Data":"bc56ade412c71d62989a39d6e00e1ad36d72e3985c60278b3113e7adcd1cee5b"}
Oct 05 20:32:15 crc kubenswrapper[4753]: I1005 20:32:15.761049 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-568f5b5b96-6t6qd"
Oct 05 20:32:16 crc kubenswrapper[4753]: I1005 20:32:16.342361 4753 generic.go:334] "Generic (PLEG): container finished" podID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerID="81cae57410065c718a3e174a14d772660043a6a78543a94b4ca1e47bd0938c4d" exitCode=0
Oct 05 20:32:16 crc kubenswrapper[4753]: I1005 20:32:16.342404 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17c5fb23-4894-4cf6-a12c-c698a36c6450","Type":"ContainerDied","Data":"81cae57410065c718a3e174a14d772660043a6a78543a94b4ca1e47bd0938c4d"}
Oct 05 20:32:16 crc kubenswrapper[4753]: I1005 20:32:16.564832 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-568f5b5b96-6t6qd"
Oct 05 20:32:16 crc kubenswrapper[4753]: I1005 20:32:16.639417 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8455698448-49hhr"]
Oct 05 20:32:16 crc kubenswrapper[4753]: I1005 20:32:16.639885 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8455698448-49hhr" podUID="3f0c72a4-2cb1-44b2-af49-fe848f359173" containerName="barbican-api-log" containerID="cri-o://4cc0e25e397e5bea43cd9883b34ccd756d781f0bddb78c949b1e14d8d9b2ac8f" gracePeriod=30
Oct 05 20:32:16 crc kubenswrapper[4753]: I1005 20:32:16.640196 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8455698448-49hhr" podUID="3f0c72a4-2cb1-44b2-af49-fe848f359173" containerName="barbican-api" containerID="cri-o://fcf652788b5e95e4ccb28fe3d3cc7180e1d84b9464809d24dc512cc048955f05" gracePeriod=30
Oct 05 20:32:17 crc kubenswrapper[4753]: I1005 20:32:17.353987 4753 generic.go:334] "Generic (PLEG): container finished" podID="3f0c72a4-2cb1-44b2-af49-fe848f359173" containerID="4cc0e25e397e5bea43cd9883b34ccd756d781f0bddb78c949b1e14d8d9b2ac8f" exitCode=143
Oct 05 20:32:17 crc kubenswrapper[4753]: I1005 20:32:17.354025 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8455698448-49hhr" event={"ID":"3f0c72a4-2cb1-44b2-af49-fe848f359173","Type":"ContainerDied","Data":"4cc0e25e397e5bea43cd9883b34ccd756d781f0bddb78c949b1e14d8d9b2ac8f"}
Oct 05 20:32:17 crc kubenswrapper[4753]: I1005
20:32:17.508438 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="ba2c30bc-68b4-4803-852e-b12fe770196d" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.157:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 05 20:32:18 crc kubenswrapper[4753]: I1005 20:32:18.240616 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 05 20:32:18 crc kubenswrapper[4753]: I1005 20:32:18.961665 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.147:3000/\": dial tcp 10.217.0.147:3000: connect: connection refused" Oct 05 20:32:20 crc kubenswrapper[4753]: I1005 20:32:20.437630 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8455698448-49hhr" podUID="3f0c72a4-2cb1-44b2-af49-fe848f359173" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:33924->10.217.0.155:9311: read: connection reset by peer" Oct 05 20:32:20 crc kubenswrapper[4753]: I1005 20:32:20.437735 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8455698448-49hhr" podUID="3f0c72a4-2cb1-44b2-af49-fe848f359173" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:33922->10.217.0.155:9311: read: connection reset by peer" Oct 05 20:32:21 crc kubenswrapper[4753]: E1005 20:32:21.241534 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:5670f9e696b19b76695bea5d4f9c46ac6494d96282f094de1243d8d7a06453b2" Oct 05 20:32:21 crc 
kubenswrapper[4753]: E1005 20:32:21.241883 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:5670f9e696b19b76695bea5d4f9c46ac6494d96282f094de1243d8d7a06453b2,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5dch97h64dhbdh84h68fh567h677h5fchf8h5b4h575h595h68fhb4hd4h558hdfh544h648h5c5h679h555h657h57dh679h6bhcbh675h75h558h99q,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-txtw2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(ae2325dc-1d53-4605-84d9-c5a341d6c311): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 20:32:21 crc kubenswrapper[4753]: E1005 20:32:21.245830 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="ae2325dc-1d53-4605-84d9-c5a341d6c311" Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.449819 4753 generic.go:334] "Generic (PLEG): container finished" podID="3f0c72a4-2cb1-44b2-af49-fe848f359173" containerID="fcf652788b5e95e4ccb28fe3d3cc7180e1d84b9464809d24dc512cc048955f05" exitCode=0 Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.450651 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8455698448-49hhr" event={"ID":"3f0c72a4-2cb1-44b2-af49-fe848f359173","Type":"ContainerDied","Data":"fcf652788b5e95e4ccb28fe3d3cc7180e1d84b9464809d24dc512cc048955f05"} Oct 05 20:32:21 crc kubenswrapper[4753]: E1005 20:32:21.453325 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:5670f9e696b19b76695bea5d4f9c46ac6494d96282f094de1243d8d7a06453b2\\\"\"" pod="openstack/openstackclient" podUID="ae2325dc-1d53-4605-84d9-c5a341d6c311" Oct 05 20:32:21 
crc kubenswrapper[4753]: I1005 20:32:21.770555 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.921292 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.955614 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.960735 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-sg-core-conf-yaml\") pod \"17c5fb23-4894-4cf6-a12c-c698a36c6450\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.960832 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17c5fb23-4894-4cf6-a12c-c698a36c6450-log-httpd\") pod \"17c5fb23-4894-4cf6-a12c-c698a36c6450\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.960867 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-combined-ca-bundle\") pod \"3f0c72a4-2cb1-44b2-af49-fe848f359173\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.960937 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq94s\" (UniqueName: \"kubernetes.io/projected/3f0c72a4-2cb1-44b2-af49-fe848f359173-kube-api-access-wq94s\") pod \"3f0c72a4-2cb1-44b2-af49-fe848f359173\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " Oct 05 20:32:21 crc kubenswrapper[4753]: 
I1005 20:32:21.960967 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-scripts\") pod \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.960984 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr5g5\" (UniqueName: \"kubernetes.io/projected/c6a9b992-43bc-4727-a6a6-f4280c356ebb-kube-api-access-wr5g5\") pod \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.961004 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph7hf\" (UniqueName: \"kubernetes.io/projected/17c5fb23-4894-4cf6-a12c-c698a36c6450-kube-api-access-ph7hf\") pod \"17c5fb23-4894-4cf6-a12c-c698a36c6450\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.961022 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-config-data-custom\") pod \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.961045 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-config-data\") pod \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.961100 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c6a9b992-43bc-4727-a6a6-f4280c356ebb-etc-machine-id\") pod 
\"c6a9b992-43bc-4727-a6a6-f4280c356ebb\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.961169 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17c5fb23-4894-4cf6-a12c-c698a36c6450-run-httpd\") pod \"17c5fb23-4894-4cf6-a12c-c698a36c6450\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.961192 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-config-data\") pod \"17c5fb23-4894-4cf6-a12c-c698a36c6450\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.961214 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-scripts\") pod \"17c5fb23-4894-4cf6-a12c-c698a36c6450\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.961251 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-config-data-custom\") pod \"3f0c72a4-2cb1-44b2-af49-fe848f359173\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.961279 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-combined-ca-bundle\") pod \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\" (UID: \"c6a9b992-43bc-4727-a6a6-f4280c356ebb\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.961300 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/3f0c72a4-2cb1-44b2-af49-fe848f359173-logs\") pod \"3f0c72a4-2cb1-44b2-af49-fe848f359173\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.961322 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-combined-ca-bundle\") pod \"17c5fb23-4894-4cf6-a12c-c698a36c6450\" (UID: \"17c5fb23-4894-4cf6-a12c-c698a36c6450\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.961338 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-config-data\") pod \"3f0c72a4-2cb1-44b2-af49-fe848f359173\" (UID: \"3f0c72a4-2cb1-44b2-af49-fe848f359173\") " Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.968584 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6a9b992-43bc-4727-a6a6-f4280c356ebb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c6a9b992-43bc-4727-a6a6-f4280c356ebb" (UID: "c6a9b992-43bc-4727-a6a6-f4280c356ebb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.969976 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17c5fb23-4894-4cf6-a12c-c698a36c6450-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "17c5fb23-4894-4cf6-a12c-c698a36c6450" (UID: "17c5fb23-4894-4cf6-a12c-c698a36c6450"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.971408 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17c5fb23-4894-4cf6-a12c-c698a36c6450-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "17c5fb23-4894-4cf6-a12c-c698a36c6450" (UID: "17c5fb23-4894-4cf6-a12c-c698a36c6450"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.972038 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f0c72a4-2cb1-44b2-af49-fe848f359173-logs" (OuterVolumeSpecName: "logs") pod "3f0c72a4-2cb1-44b2-af49-fe848f359173" (UID: "3f0c72a4-2cb1-44b2-af49-fe848f359173"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.983131 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-scripts" (OuterVolumeSpecName: "scripts") pod "c6a9b992-43bc-4727-a6a6-f4280c356ebb" (UID: "c6a9b992-43bc-4727-a6a6-f4280c356ebb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.986939 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6a9b992-43bc-4727-a6a6-f4280c356ebb-kube-api-access-wr5g5" (OuterVolumeSpecName: "kube-api-access-wr5g5") pod "c6a9b992-43bc-4727-a6a6-f4280c356ebb" (UID: "c6a9b992-43bc-4727-a6a6-f4280c356ebb"). InnerVolumeSpecName "kube-api-access-wr5g5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.988997 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3f0c72a4-2cb1-44b2-af49-fe848f359173" (UID: "3f0c72a4-2cb1-44b2-af49-fe848f359173"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:21 crc kubenswrapper[4753]: I1005 20:32:21.989165 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f0c72a4-2cb1-44b2-af49-fe848f359173-kube-api-access-wq94s" (OuterVolumeSpecName: "kube-api-access-wq94s") pod "3f0c72a4-2cb1-44b2-af49-fe848f359173" (UID: "3f0c72a4-2cb1-44b2-af49-fe848f359173"). InnerVolumeSpecName "kube-api-access-wq94s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.011717 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17c5fb23-4894-4cf6-a12c-c698a36c6450-kube-api-access-ph7hf" (OuterVolumeSpecName: "kube-api-access-ph7hf") pod "17c5fb23-4894-4cf6-a12c-c698a36c6450" (UID: "17c5fb23-4894-4cf6-a12c-c698a36c6450"). InnerVolumeSpecName "kube-api-access-ph7hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.015164 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c6a9b992-43bc-4727-a6a6-f4280c356ebb" (UID: "c6a9b992-43bc-4727-a6a6-f4280c356ebb"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.029340 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-scripts" (OuterVolumeSpecName: "scripts") pod "17c5fb23-4894-4cf6-a12c-c698a36c6450" (UID: "17c5fb23-4894-4cf6-a12c-c698a36c6450"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.055068 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6a9b992-43bc-4727-a6a6-f4280c356ebb" (UID: "c6a9b992-43bc-4727-a6a6-f4280c356ebb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.057969 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "17c5fb23-4894-4cf6-a12c-c698a36c6450" (UID: "17c5fb23-4894-4cf6-a12c-c698a36c6450"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.064428 4753 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17c5fb23-4894-4cf6-a12c-c698a36c6450-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.064455 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.064465 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.064473 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.064482 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f0c72a4-2cb1-44b2-af49-fe848f359173-logs\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.064490 4753 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.064497 4753 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17c5fb23-4894-4cf6-a12c-c698a36c6450-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.064505 4753 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-wq94s\" (UniqueName: \"kubernetes.io/projected/3f0c72a4-2cb1-44b2-af49-fe848f359173-kube-api-access-wq94s\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.064514 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.064522 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr5g5\" (UniqueName: \"kubernetes.io/projected/c6a9b992-43bc-4727-a6a6-f4280c356ebb-kube-api-access-wr5g5\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.064530 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph7hf\" (UniqueName: \"kubernetes.io/projected/17c5fb23-4894-4cf6-a12c-c698a36c6450-kube-api-access-ph7hf\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.064538 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.064547 4753 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c6a9b992-43bc-4727-a6a6-f4280c356ebb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.081556 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-config-data" (OuterVolumeSpecName: "config-data") pod "3f0c72a4-2cb1-44b2-af49-fe848f359173" (UID: "3f0c72a4-2cb1-44b2-af49-fe848f359173"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.083234 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f0c72a4-2cb1-44b2-af49-fe848f359173" (UID: "3f0c72a4-2cb1-44b2-af49-fe848f359173"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.115979 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17c5fb23-4894-4cf6-a12c-c698a36c6450" (UID: "17c5fb23-4894-4cf6-a12c-c698a36c6450"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.146644 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-config-data" (OuterVolumeSpecName: "config-data") pod "17c5fb23-4894-4cf6-a12c-c698a36c6450" (UID: "17c5fb23-4894-4cf6-a12c-c698a36c6450"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.166612 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.166644 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.166653 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17c5fb23-4894-4cf6-a12c-c698a36c6450-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.166661 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f0c72a4-2cb1-44b2-af49-fe848f359173-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.172314 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-config-data" (OuterVolumeSpecName: "config-data") pod "c6a9b992-43bc-4727-a6a6-f4280c356ebb" (UID: "c6a9b992-43bc-4727-a6a6-f4280c356ebb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:22 crc kubenswrapper[4753]: W1005 20:32:22.256447 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1a250cc_b19d_4d1c_acd6_aa2c21ddc3bc.slice/crio-dc85156866f63d4cb0ccc90c3500996cf0c2dfba2d89df5f619d3c83ef5e53cf WatchSource:0}: Error finding container dc85156866f63d4cb0ccc90c3500996cf0c2dfba2d89df5f619d3c83ef5e53cf: Status 404 returned error can't find the container with id dc85156866f63d4cb0ccc90c3500996cf0c2dfba2d89df5f619d3c83ef5e53cf Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.263076 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4nx88"] Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.268100 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a9b992-43bc-4727-a6a6-f4280c356ebb-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.283318 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rbfbj"] Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.354780 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jk77b"] Oct 05 20:32:22 crc kubenswrapper[4753]: W1005 20:32:22.378582 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod959512b8_c8e2_41eb_9405_a99df49caf33.slice/crio-9a0ec19f0d09b6af9b6f2482462d7cd5f518edf0f4dcab4b2cd5c33f72c53752 WatchSource:0}: Error finding container 9a0ec19f0d09b6af9b6f2482462d7cd5f518edf0f4dcab4b2cd5c33f72c53752: Status 404 returned error can't find the container with id 9a0ec19f0d09b6af9b6f2482462d7cd5f518edf0f4dcab4b2cd5c33f72c53752 Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.467464 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.470164 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c6a9b992-43bc-4727-a6a6-f4280c356ebb","Type":"ContainerDied","Data":"1fde8540ec9f4a22aa4e7d9b3a084a3498970c3e510fc1a9b56a524126569393"} Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.470228 4753 scope.go:117] "RemoveContainer" containerID="6c429f1d58b05d5cbe0489b8d150c2783d949106e3af44ee9ac35bb2234ce719" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.480381 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4nx88" event={"ID":"a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc","Type":"ContainerStarted","Data":"dc85156866f63d4cb0ccc90c3500996cf0c2dfba2d89df5f619d3c83ef5e53cf"} Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.482541 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8455698448-49hhr" event={"ID":"3f0c72a4-2cb1-44b2-af49-fe848f359173","Type":"ContainerDied","Data":"3952eb0b7b892b75e28c3e5ef385c09aec82a4cad4e28ed3e42cd4390e866324"} Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.482719 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8455698448-49hhr" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.494102 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jk77b" event={"ID":"959512b8-c8e2-41eb-9405-a99df49caf33","Type":"ContainerStarted","Data":"9a0ec19f0d09b6af9b6f2482462d7cd5f518edf0f4dcab4b2cd5c33f72c53752"} Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.495205 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rbfbj" event={"ID":"9b80ea4d-52cf-4876-bd3a-436c9e65c93f","Type":"ContainerStarted","Data":"af5122a92af5bbfbde80fc5277222dc4016f80b17098ea8ab8a9c01bca712d70"} Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.495337 4753 scope.go:117] "RemoveContainer" containerID="bd053a9c699f6bb9c73a8e5ad3581056fd480ba9e61ee35780dc5e8b553f7f7f" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.514320 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17c5fb23-4894-4cf6-a12c-c698a36c6450","Type":"ContainerDied","Data":"bd2c030521eb6a8e98d1ecd14f0bd132c109c017e6c73122e853150120213688"} Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.514419 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.537505 4753 scope.go:117] "RemoveContainer" containerID="fcf652788b5e95e4ccb28fe3d3cc7180e1d84b9464809d24dc512cc048955f05" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.539705 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.560111 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.580597 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 05 20:32:22 crc kubenswrapper[4753]: E1005 20:32:22.581022 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f0c72a4-2cb1-44b2-af49-fe848f359173" containerName="barbican-api-log" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581040 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f0c72a4-2cb1-44b2-af49-fe848f359173" containerName="barbican-api-log" Oct 05 20:32:22 crc kubenswrapper[4753]: E1005 20:32:22.581057 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="sg-core" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581065 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="sg-core" Oct 05 20:32:22 crc kubenswrapper[4753]: E1005 20:32:22.581073 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f0c72a4-2cb1-44b2-af49-fe848f359173" containerName="barbican-api" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581078 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f0c72a4-2cb1-44b2-af49-fe848f359173" containerName="barbican-api" Oct 05 20:32:22 crc kubenswrapper[4753]: E1005 20:32:22.581089 4753 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c6a9b992-43bc-4727-a6a6-f4280c356ebb" containerName="probe" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581094 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a9b992-43bc-4727-a6a6-f4280c356ebb" containerName="probe" Oct 05 20:32:22 crc kubenswrapper[4753]: E1005 20:32:22.581132 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="ceilometer-central-agent" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581150 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="ceilometer-central-agent" Oct 05 20:32:22 crc kubenswrapper[4753]: E1005 20:32:22.581160 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="ceilometer-notification-agent" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581166 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="ceilometer-notification-agent" Oct 05 20:32:22 crc kubenswrapper[4753]: E1005 20:32:22.581183 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="proxy-httpd" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581189 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="proxy-httpd" Oct 05 20:32:22 crc kubenswrapper[4753]: E1005 20:32:22.581199 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a9b992-43bc-4727-a6a6-f4280c356ebb" containerName="cinder-scheduler" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581205 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a9b992-43bc-4727-a6a6-f4280c356ebb" containerName="cinder-scheduler" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581378 4753 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="ceilometer-central-agent" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581395 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a9b992-43bc-4727-a6a6-f4280c356ebb" containerName="probe" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581406 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f0c72a4-2cb1-44b2-af49-fe848f359173" containerName="barbican-api-log" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581417 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="ceilometer-notification-agent" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581425 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="proxy-httpd" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581435 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" containerName="sg-core" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581442 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a9b992-43bc-4727-a6a6-f4280c356ebb" containerName="cinder-scheduler" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.581451 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f0c72a4-2cb1-44b2-af49-fe848f359173" containerName="barbican-api" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.582379 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.590419 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.595537 4753 scope.go:117] "RemoveContainer" containerID="4cc0e25e397e5bea43cd9883b34ccd756d781f0bddb78c949b1e14d8d9b2ac8f" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.595699 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8455698448-49hhr"] Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.604837 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.614558 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-8455698448-49hhr"] Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.659543 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.669526 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.672047 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.674748 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.681330 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.681528 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.686306 4753 scope.go:117] "RemoveContainer" containerID="00c099a79a982ca0152e2f01be1f6f641f4d833e6463861f5fd79273026ef46f" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.690187 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.728635 4753 scope.go:117] "RemoveContainer" containerID="459a708a60fda760e18b6910009580528f842822d6bbb458960969d25dd88f45" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.758960 4753 scope.go:117] "RemoveContainer" containerID="81cae57410065c718a3e174a14d772660043a6a78543a94b4ca1e47bd0938c4d" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.776221 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.776262 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.776280 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-scripts\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.776301 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56bdb919-1995-4b2a-855b-4d7ece37ce4c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.776317 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56bdb919-1995-4b2a-855b-4d7ece37ce4c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.776335 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56bdb919-1995-4b2a-855b-4d7ece37ce4c-scripts\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.776365 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-run-httpd\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.776389 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7twt\" (UniqueName: 
\"kubernetes.io/projected/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-kube-api-access-p7twt\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.776440 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56bdb919-1995-4b2a-855b-4d7ece37ce4c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.776458 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-log-httpd\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.776473 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56bdb919-1995-4b2a-855b-4d7ece37ce4c-config-data\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.776517 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh2zg\" (UniqueName: \"kubernetes.io/projected/56bdb919-1995-4b2a-855b-4d7ece37ce4c-kube-api-access-bh2zg\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.776542 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-config-data\") pod 
\"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.877905 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56bdb919-1995-4b2a-855b-4d7ece37ce4c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.878186 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-log-httpd\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.878313 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56bdb919-1995-4b2a-855b-4d7ece37ce4c-config-data\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.878518 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh2zg\" (UniqueName: \"kubernetes.io/projected/56bdb919-1995-4b2a-855b-4d7ece37ce4c-kube-api-access-bh2zg\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.878606 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-log-httpd\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.879128 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-config-data\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.879839 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.879979 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.880111 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-scripts\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.881349 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56bdb919-1995-4b2a-855b-4d7ece37ce4c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.881673 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56bdb919-1995-4b2a-855b-4d7ece37ce4c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.881586 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56bdb919-1995-4b2a-855b-4d7ece37ce4c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.882349 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56bdb919-1995-4b2a-855b-4d7ece37ce4c-scripts\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.882602 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-run-httpd\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.884069 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-scripts\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.884499 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.884675 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-run-httpd\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.884843 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56bdb919-1995-4b2a-855b-4d7ece37ce4c-config-data\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.885982 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56bdb919-1995-4b2a-855b-4d7ece37ce4c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.887259 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.887456 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-config-data\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.887833 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56bdb919-1995-4b2a-855b-4d7ece37ce4c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 
20:32:22.888020 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7twt\" (UniqueName: \"kubernetes.io/projected/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-kube-api-access-p7twt\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.892009 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56bdb919-1995-4b2a-855b-4d7ece37ce4c-scripts\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.901065 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh2zg\" (UniqueName: \"kubernetes.io/projected/56bdb919-1995-4b2a-855b-4d7ece37ce4c-kube-api-access-bh2zg\") pod \"cinder-scheduler-0\" (UID: \"56bdb919-1995-4b2a-855b-4d7ece37ce4c\") " pod="openstack/cinder-scheduler-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.905861 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7twt\" (UniqueName: \"kubernetes.io/projected/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-kube-api-access-p7twt\") pod \"ceilometer-0\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " pod="openstack/ceilometer-0" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.912676 4753 scope.go:117] "RemoveContainer" containerID="bc56ade412c71d62989a39d6e00e1ad36d72e3985c60278b3113e7adcd1cee5b" Oct 05 20:32:22 crc kubenswrapper[4753]: I1005 20:32:22.913048 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 05 20:32:23 crc kubenswrapper[4753]: I1005 20:32:23.013562 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:32:23 crc kubenswrapper[4753]: I1005 20:32:23.380438 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 05 20:32:23 crc kubenswrapper[4753]: I1005 20:32:23.542536 4753 generic.go:334] "Generic (PLEG): container finished" podID="9b80ea4d-52cf-4876-bd3a-436c9e65c93f" containerID="b5791038b45dcc30ecf7e5daef87bc05e17f4d21d248cbe1e6b5b086e155a1c5" exitCode=0 Oct 05 20:32:23 crc kubenswrapper[4753]: I1005 20:32:23.543378 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rbfbj" event={"ID":"9b80ea4d-52cf-4876-bd3a-436c9e65c93f","Type":"ContainerDied","Data":"b5791038b45dcc30ecf7e5daef87bc05e17f4d21d248cbe1e6b5b086e155a1c5"} Oct 05 20:32:23 crc kubenswrapper[4753]: I1005 20:32:23.578344 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"56bdb919-1995-4b2a-855b-4d7ece37ce4c","Type":"ContainerStarted","Data":"02350fe8ba0432b94c7a33dc0d5e9439f93a2bcb051db6ecb64a72f5d0a9b4ec"} Oct 05 20:32:23 crc kubenswrapper[4753]: I1005 20:32:23.581522 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:32:23 crc kubenswrapper[4753]: I1005 20:32:23.586765 4753 generic.go:334] "Generic (PLEG): container finished" podID="a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc" containerID="9f3f526fa9e8a7d9f12fc7b2f037c218f766c9695038b539ff8561d71624116d" exitCode=0 Oct 05 20:32:23 crc kubenswrapper[4753]: I1005 20:32:23.586875 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4nx88" event={"ID":"a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc","Type":"ContainerDied","Data":"9f3f526fa9e8a7d9f12fc7b2f037c218f766c9695038b539ff8561d71624116d"} Oct 05 20:32:23 crc kubenswrapper[4753]: I1005 20:32:23.598080 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jk77b" 
event={"ID":"959512b8-c8e2-41eb-9405-a99df49caf33","Type":"ContainerDied","Data":"d78cf11b2a8724edb5e71180493e2d325fdec51bf36cf32c7f665541cfe3e31b"} Oct 05 20:32:23 crc kubenswrapper[4753]: I1005 20:32:23.598757 4753 generic.go:334] "Generic (PLEG): container finished" podID="959512b8-c8e2-41eb-9405-a99df49caf33" containerID="d78cf11b2a8724edb5e71180493e2d325fdec51bf36cf32c7f665541cfe3e31b" exitCode=0 Oct 05 20:32:23 crc kubenswrapper[4753]: I1005 20:32:23.867685 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17c5fb23-4894-4cf6-a12c-c698a36c6450" path="/var/lib/kubelet/pods/17c5fb23-4894-4cf6-a12c-c698a36c6450/volumes" Oct 05 20:32:23 crc kubenswrapper[4753]: I1005 20:32:23.868713 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f0c72a4-2cb1-44b2-af49-fe848f359173" path="/var/lib/kubelet/pods/3f0c72a4-2cb1-44b2-af49-fe848f359173/volumes" Oct 05 20:32:23 crc kubenswrapper[4753]: I1005 20:32:23.869300 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6a9b992-43bc-4727-a6a6-f4280c356ebb" path="/var/lib/kubelet/pods/c6a9b992-43bc-4727-a6a6-f4280c356ebb/volumes" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.594539 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.613644 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"56bdb919-1995-4b2a-855b-4d7ece37ce4c","Type":"ContainerStarted","Data":"a3816e36f7d05d8009e893706c0e0ccb6077d860372cbc48095fbf2e00f1f668"} Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.615065 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4","Type":"ContainerStarted","Data":"5a34d799b2d21e502ac4ca672ce4fa56f41398826e93d83119c983d4f19571fb"} Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.643263 4753 generic.go:334] "Generic (PLEG): container finished" podID="c507e884-7868-46c1-b89a-e8ee71f3e8e1" containerID="b69237393cc61b3255df8f36afafb6d480e49844867db99dd24af1a55c4a796b" exitCode=0 Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.643523 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-866d77d954-p4qfh" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.644159 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-866d77d954-p4qfh" event={"ID":"c507e884-7868-46c1-b89a-e8ee71f3e8e1","Type":"ContainerDied","Data":"b69237393cc61b3255df8f36afafb6d480e49844867db99dd24af1a55c4a796b"} Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.644186 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-866d77d954-p4qfh" event={"ID":"c507e884-7868-46c1-b89a-e8ee71f3e8e1","Type":"ContainerDied","Data":"6946ac6b844ab9a07adbc4751d09b754f4040dc519fe326119d59397fcd17007"} Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.644204 4753 scope.go:117] "RemoveContainer" containerID="90f7763ad014d54f29c0606d7396bbc5c87ea44fb26910e2a8c081b7db9f1104" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.756779 4753 scope.go:117] "RemoveContainer" containerID="b69237393cc61b3255df8f36afafb6d480e49844867db99dd24af1a55c4a796b" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.764732 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-combined-ca-bundle\") pod \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.764785 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mrtl\" (UniqueName: \"kubernetes.io/projected/c507e884-7868-46c1-b89a-e8ee71f3e8e1-kube-api-access-2mrtl\") pod \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.764836 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-config\") pod \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.764892 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-httpd-config\") pod \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.764920 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-ovndb-tls-certs\") pod \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\" (UID: \"c507e884-7868-46c1-b89a-e8ee71f3e8e1\") " Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.784399 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c507e884-7868-46c1-b89a-e8ee71f3e8e1" (UID: "c507e884-7868-46c1-b89a-e8ee71f3e8e1"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.788391 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c507e884-7868-46c1-b89a-e8ee71f3e8e1-kube-api-access-2mrtl" (OuterVolumeSpecName: "kube-api-access-2mrtl") pod "c507e884-7868-46c1-b89a-e8ee71f3e8e1" (UID: "c507e884-7868-46c1-b89a-e8ee71f3e8e1"). InnerVolumeSpecName "kube-api-access-2mrtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.831325 4753 scope.go:117] "RemoveContainer" containerID="90f7763ad014d54f29c0606d7396bbc5c87ea44fb26910e2a8c081b7db9f1104" Oct 05 20:32:24 crc kubenswrapper[4753]: E1005 20:32:24.832068 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90f7763ad014d54f29c0606d7396bbc5c87ea44fb26910e2a8c081b7db9f1104\": container with ID starting with 90f7763ad014d54f29c0606d7396bbc5c87ea44fb26910e2a8c081b7db9f1104 not found: ID does not exist" containerID="90f7763ad014d54f29c0606d7396bbc5c87ea44fb26910e2a8c081b7db9f1104" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.832112 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90f7763ad014d54f29c0606d7396bbc5c87ea44fb26910e2a8c081b7db9f1104"} err="failed to get container status \"90f7763ad014d54f29c0606d7396bbc5c87ea44fb26910e2a8c081b7db9f1104\": rpc error: code = NotFound desc = could not find container \"90f7763ad014d54f29c0606d7396bbc5c87ea44fb26910e2a8c081b7db9f1104\": container with ID starting with 90f7763ad014d54f29c0606d7396bbc5c87ea44fb26910e2a8c081b7db9f1104 not found: ID does not exist" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.832158 4753 scope.go:117] "RemoveContainer" containerID="b69237393cc61b3255df8f36afafb6d480e49844867db99dd24af1a55c4a796b" Oct 05 20:32:24 crc kubenswrapper[4753]: E1005 20:32:24.832536 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b69237393cc61b3255df8f36afafb6d480e49844867db99dd24af1a55c4a796b\": container with ID starting with b69237393cc61b3255df8f36afafb6d480e49844867db99dd24af1a55c4a796b not found: ID does not exist" containerID="b69237393cc61b3255df8f36afafb6d480e49844867db99dd24af1a55c4a796b" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.832557 
4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b69237393cc61b3255df8f36afafb6d480e49844867db99dd24af1a55c4a796b"} err="failed to get container status \"b69237393cc61b3255df8f36afafb6d480e49844867db99dd24af1a55c4a796b\": rpc error: code = NotFound desc = could not find container \"b69237393cc61b3255df8f36afafb6d480e49844867db99dd24af1a55c4a796b\": container with ID starting with b69237393cc61b3255df8f36afafb6d480e49844867db99dd24af1a55c4a796b not found: ID does not exist" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.866097 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.866173 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mrtl\" (UniqueName: \"kubernetes.io/projected/c507e884-7868-46c1-b89a-e8ee71f3e8e1-kube-api-access-2mrtl\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.876700 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c507e884-7868-46c1-b89a-e8ee71f3e8e1" (UID: "c507e884-7868-46c1-b89a-e8ee71f3e8e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.917254 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c507e884-7868-46c1-b89a-e8ee71f3e8e1" (UID: "c507e884-7868-46c1-b89a-e8ee71f3e8e1"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.922363 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-config" (OuterVolumeSpecName: "config") pod "c507e884-7868-46c1-b89a-e8ee71f3e8e1" (UID: "c507e884-7868-46c1-b89a-e8ee71f3e8e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.973266 4753 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.973295 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:24 crc kubenswrapper[4753]: I1005 20:32:24.973303 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c507e884-7868-46c1-b89a-e8ee71f3e8e1-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.028208 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-866d77d954-p4qfh"] Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.034735 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-866d77d954-p4qfh"] Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.135853 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rbfbj" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.239396 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-jk77b" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.283711 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qslsz\" (UniqueName: \"kubernetes.io/projected/9b80ea4d-52cf-4876-bd3a-436c9e65c93f-kube-api-access-qslsz\") pod \"9b80ea4d-52cf-4876-bd3a-436c9e65c93f\" (UID: \"9b80ea4d-52cf-4876-bd3a-436c9e65c93f\") " Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.290525 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b80ea4d-52cf-4876-bd3a-436c9e65c93f-kube-api-access-qslsz" (OuterVolumeSpecName: "kube-api-access-qslsz") pod "9b80ea4d-52cf-4876-bd3a-436c9e65c93f" (UID: "9b80ea4d-52cf-4876-bd3a-436c9e65c93f"). InnerVolumeSpecName "kube-api-access-qslsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.385723 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj92s\" (UniqueName: \"kubernetes.io/projected/959512b8-c8e2-41eb-9405-a99df49caf33-kube-api-access-pj92s\") pod \"959512b8-c8e2-41eb-9405-a99df49caf33\" (UID: \"959512b8-c8e2-41eb-9405-a99df49caf33\") " Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.386132 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qslsz\" (UniqueName: \"kubernetes.io/projected/9b80ea4d-52cf-4876-bd3a-436c9e65c93f-kube-api-access-qslsz\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.391176 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/959512b8-c8e2-41eb-9405-a99df49caf33-kube-api-access-pj92s" (OuterVolumeSpecName: "kube-api-access-pj92s") pod "959512b8-c8e2-41eb-9405-a99df49caf33" (UID: "959512b8-c8e2-41eb-9405-a99df49caf33"). InnerVolumeSpecName "kube-api-access-pj92s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.451292 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4nx88" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.487716 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj92s\" (UniqueName: \"kubernetes.io/projected/959512b8-c8e2-41eb-9405-a99df49caf33-kube-api-access-pj92s\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.588349 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7mbw\" (UniqueName: \"kubernetes.io/projected/a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc-kube-api-access-q7mbw\") pod \"a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc\" (UID: \"a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc\") " Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.591549 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc-kube-api-access-q7mbw" (OuterVolumeSpecName: "kube-api-access-q7mbw") pod "a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc" (UID: "a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc"). InnerVolumeSpecName "kube-api-access-q7mbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.652433 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"56bdb919-1995-4b2a-855b-4d7ece37ce4c","Type":"ContainerStarted","Data":"fa160612054b0b02946f76ede45188e5a4da576be773ac174482b85b7524e076"} Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.655185 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4","Type":"ContainerStarted","Data":"c1b4561bffbec804356bb7dfe417f821e001b1734ecafdaaaa0a6694618af322"} Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.655211 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4","Type":"ContainerStarted","Data":"fb2864a14bf737b93ea0924247222121c9b43a6800e33f8020ce5da4703e9411"} Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.657489 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4nx88" event={"ID":"a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc","Type":"ContainerDied","Data":"dc85156866f63d4cb0ccc90c3500996cf0c2dfba2d89df5f619d3c83ef5e53cf"} Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.657511 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc85156866f63d4cb0ccc90c3500996cf0c2dfba2d89df5f619d3c83ef5e53cf" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.657546 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4nx88" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.661765 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jk77b" event={"ID":"959512b8-c8e2-41eb-9405-a99df49caf33","Type":"ContainerDied","Data":"9a0ec19f0d09b6af9b6f2482462d7cd5f518edf0f4dcab4b2cd5c33f72c53752"} Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.661788 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jk77b" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.661802 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a0ec19f0d09b6af9b6f2482462d7cd5f518edf0f4dcab4b2cd5c33f72c53752" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.663995 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rbfbj" event={"ID":"9b80ea4d-52cf-4876-bd3a-436c9e65c93f","Type":"ContainerDied","Data":"af5122a92af5bbfbde80fc5277222dc4016f80b17098ea8ab8a9c01bca712d70"} Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.664017 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af5122a92af5bbfbde80fc5277222dc4016f80b17098ea8ab8a9c01bca712d70" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.664065 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rbfbj" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.678718 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.678698781 podStartE2EDuration="3.678698781s" podCreationTimestamp="2025-10-05 20:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:32:25.678694351 +0000 UTC m=+1054.527022583" watchObservedRunningTime="2025-10-05 20:32:25.678698781 +0000 UTC m=+1054.527027013" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.690235 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7mbw\" (UniqueName: \"kubernetes.io/projected/a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc-kube-api-access-q7mbw\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:25 crc kubenswrapper[4753]: I1005 20:32:25.861600 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c507e884-7868-46c1-b89a-e8ee71f3e8e1" path="/var/lib/kubelet/pods/c507e884-7868-46c1-b89a-e8ee71f3e8e1/volumes" Oct 05 20:32:26 crc kubenswrapper[4753]: I1005 20:32:26.674328 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4","Type":"ContainerStarted","Data":"bcd746e8b6eccc4aa8323eda49023536e43cf4a097a02cdd5edb52f5a0dcd56d"} Oct 05 20:32:27 crc kubenswrapper[4753]: I1005 20:32:27.913866 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 05 20:32:28 crc kubenswrapper[4753]: I1005 20:32:28.241257 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:32:28 crc kubenswrapper[4753]: I1005 20:32:28.691645 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4","Type":"ContainerStarted","Data":"03676ed7176e8f2b092d6f5287d264b28585b1a12edfda853f3a5d000993a858"} Oct 05 20:32:28 crc kubenswrapper[4753]: I1005 20:32:28.692162 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="ceilometer-central-agent" containerID="cri-o://fb2864a14bf737b93ea0924247222121c9b43a6800e33f8020ce5da4703e9411" gracePeriod=30 Oct 05 20:32:28 crc kubenswrapper[4753]: I1005 20:32:28.692435 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 05 20:32:28 crc kubenswrapper[4753]: I1005 20:32:28.692506 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="proxy-httpd" containerID="cri-o://03676ed7176e8f2b092d6f5287d264b28585b1a12edfda853f3a5d000993a858" gracePeriod=30 Oct 05 20:32:28 crc kubenswrapper[4753]: I1005 20:32:28.692573 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="sg-core" containerID="cri-o://bcd746e8b6eccc4aa8323eda49023536e43cf4a097a02cdd5edb52f5a0dcd56d" gracePeriod=30 Oct 05 20:32:28 crc kubenswrapper[4753]: I1005 20:32:28.692590 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="ceilometer-notification-agent" containerID="cri-o://c1b4561bffbec804356bb7dfe417f821e001b1734ecafdaaaa0a6694618af322" gracePeriod=30 Oct 05 20:32:28 crc kubenswrapper[4753]: I1005 20:32:28.720954 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.023504996 podStartE2EDuration="6.720938861s" podCreationTimestamp="2025-10-05 20:32:22 +0000 UTC" 
firstStartedPulling="2025-10-05 20:32:23.582854156 +0000 UTC m=+1052.431182388" lastFinishedPulling="2025-10-05 20:32:28.280288021 +0000 UTC m=+1057.128616253" observedRunningTime="2025-10-05 20:32:28.716914946 +0000 UTC m=+1057.565243178" watchObservedRunningTime="2025-10-05 20:32:28.720938861 +0000 UTC m=+1057.569267093" Oct 05 20:32:29 crc kubenswrapper[4753]: I1005 20:32:29.703068 4753 generic.go:334] "Generic (PLEG): container finished" podID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerID="bcd746e8b6eccc4aa8323eda49023536e43cf4a097a02cdd5edb52f5a0dcd56d" exitCode=2 Oct 05 20:32:29 crc kubenswrapper[4753]: I1005 20:32:29.703096 4753 generic.go:334] "Generic (PLEG): container finished" podID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerID="c1b4561bffbec804356bb7dfe417f821e001b1734ecafdaaaa0a6694618af322" exitCode=0 Oct 05 20:32:29 crc kubenswrapper[4753]: I1005 20:32:29.703104 4753 generic.go:334] "Generic (PLEG): container finished" podID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerID="fb2864a14bf737b93ea0924247222121c9b43a6800e33f8020ce5da4703e9411" exitCode=0 Oct 05 20:32:29 crc kubenswrapper[4753]: I1005 20:32:29.703124 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4","Type":"ContainerDied","Data":"bcd746e8b6eccc4aa8323eda49023536e43cf4a097a02cdd5edb52f5a0dcd56d"} Oct 05 20:32:29 crc kubenswrapper[4753]: I1005 20:32:29.703161 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4","Type":"ContainerDied","Data":"c1b4561bffbec804356bb7dfe417f821e001b1734ecafdaaaa0a6694618af322"} Oct 05 20:32:29 crc kubenswrapper[4753]: I1005 20:32:29.703172 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4","Type":"ContainerDied","Data":"fb2864a14bf737b93ea0924247222121c9b43a6800e33f8020ce5da4703e9411"} Oct 05 20:32:31 
crc kubenswrapper[4753]: I1005 20:32:31.772415 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-3a64-account-create-b4hzd"] Oct 05 20:32:31 crc kubenswrapper[4753]: E1005 20:32:31.773204 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c507e884-7868-46c1-b89a-e8ee71f3e8e1" containerName="neutron-api" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.773218 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c507e884-7868-46c1-b89a-e8ee71f3e8e1" containerName="neutron-api" Oct 05 20:32:31 crc kubenswrapper[4753]: E1005 20:32:31.773244 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc" containerName="mariadb-database-create" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.773250 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc" containerName="mariadb-database-create" Oct 05 20:32:31 crc kubenswrapper[4753]: E1005 20:32:31.773264 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959512b8-c8e2-41eb-9405-a99df49caf33" containerName="mariadb-database-create" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.773272 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="959512b8-c8e2-41eb-9405-a99df49caf33" containerName="mariadb-database-create" Oct 05 20:32:31 crc kubenswrapper[4753]: E1005 20:32:31.773281 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c507e884-7868-46c1-b89a-e8ee71f3e8e1" containerName="neutron-httpd" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.773287 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c507e884-7868-46c1-b89a-e8ee71f3e8e1" containerName="neutron-httpd" Oct 05 20:32:31 crc kubenswrapper[4753]: E1005 20:32:31.773301 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b80ea4d-52cf-4876-bd3a-436c9e65c93f" containerName="mariadb-database-create" Oct 05 20:32:31 crc 
kubenswrapper[4753]: I1005 20:32:31.773307 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b80ea4d-52cf-4876-bd3a-436c9e65c93f" containerName="mariadb-database-create" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.773459 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c507e884-7868-46c1-b89a-e8ee71f3e8e1" containerName="neutron-api" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.773479 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="959512b8-c8e2-41eb-9405-a99df49caf33" containerName="mariadb-database-create" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.773491 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b80ea4d-52cf-4876-bd3a-436c9e65c93f" containerName="mariadb-database-create" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.773499 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c507e884-7868-46c1-b89a-e8ee71f3e8e1" containerName="neutron-httpd" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.773508 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc" containerName="mariadb-database-create" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.774044 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3a64-account-create-b4hzd" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.777460 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.778526 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3a64-account-create-b4hzd"] Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.804532 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkcvb\" (UniqueName: \"kubernetes.io/projected/9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4-kube-api-access-pkcvb\") pod \"nova-api-3a64-account-create-b4hzd\" (UID: \"9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4\") " pod="openstack/nova-api-3a64-account-create-b4hzd" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.906359 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkcvb\" (UniqueName: \"kubernetes.io/projected/9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4-kube-api-access-pkcvb\") pod \"nova-api-3a64-account-create-b4hzd\" (UID: \"9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4\") " pod="openstack/nova-api-3a64-account-create-b4hzd" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.924892 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkcvb\" (UniqueName: \"kubernetes.io/projected/9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4-kube-api-access-pkcvb\") pod \"nova-api-3a64-account-create-b4hzd\" (UID: \"9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4\") " pod="openstack/nova-api-3a64-account-create-b4hzd" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.956081 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-15d6-account-create-fsjpw"] Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.957054 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-15d6-account-create-fsjpw" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.961737 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 05 20:32:31 crc kubenswrapper[4753]: I1005 20:32:31.967737 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-15d6-account-create-fsjpw"] Oct 05 20:32:32 crc kubenswrapper[4753]: I1005 20:32:32.007949 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mq8t\" (UniqueName: \"kubernetes.io/projected/6d8b551b-c44d-42dc-8a3b-0fbf1df013f2-kube-api-access-2mq8t\") pod \"nova-cell0-15d6-account-create-fsjpw\" (UID: \"6d8b551b-c44d-42dc-8a3b-0fbf1df013f2\") " pod="openstack/nova-cell0-15d6-account-create-fsjpw" Oct 05 20:32:32 crc kubenswrapper[4753]: I1005 20:32:32.098279 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3a64-account-create-b4hzd" Oct 05 20:32:32 crc kubenswrapper[4753]: I1005 20:32:32.109261 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mq8t\" (UniqueName: \"kubernetes.io/projected/6d8b551b-c44d-42dc-8a3b-0fbf1df013f2-kube-api-access-2mq8t\") pod \"nova-cell0-15d6-account-create-fsjpw\" (UID: \"6d8b551b-c44d-42dc-8a3b-0fbf1df013f2\") " pod="openstack/nova-cell0-15d6-account-create-fsjpw" Oct 05 20:32:32 crc kubenswrapper[4753]: I1005 20:32:32.138455 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mq8t\" (UniqueName: \"kubernetes.io/projected/6d8b551b-c44d-42dc-8a3b-0fbf1df013f2-kube-api-access-2mq8t\") pod \"nova-cell0-15d6-account-create-fsjpw\" (UID: \"6d8b551b-c44d-42dc-8a3b-0fbf1df013f2\") " pod="openstack/nova-cell0-15d6-account-create-fsjpw" Oct 05 20:32:32 crc kubenswrapper[4753]: I1005 20:32:32.171002 4753 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-4cbd-account-create-6zmlh"] Oct 05 20:32:32 crc kubenswrapper[4753]: I1005 20:32:32.172119 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4cbd-account-create-6zmlh" Oct 05 20:32:32 crc kubenswrapper[4753]: I1005 20:32:32.175372 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 05 20:32:32 crc kubenswrapper[4753]: I1005 20:32:32.182749 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4cbd-account-create-6zmlh"] Oct 05 20:32:32 crc kubenswrapper[4753]: I1005 20:32:32.210961 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4m9q\" (UniqueName: \"kubernetes.io/projected/1b219057-d805-46f6-b393-8c05f64ff2ce-kube-api-access-c4m9q\") pod \"nova-cell1-4cbd-account-create-6zmlh\" (UID: \"1b219057-d805-46f6-b393-8c05f64ff2ce\") " pod="openstack/nova-cell1-4cbd-account-create-6zmlh" Oct 05 20:32:32 crc kubenswrapper[4753]: I1005 20:32:32.276346 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-15d6-account-create-fsjpw" Oct 05 20:32:32 crc kubenswrapper[4753]: I1005 20:32:32.312608 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4m9q\" (UniqueName: \"kubernetes.io/projected/1b219057-d805-46f6-b393-8c05f64ff2ce-kube-api-access-c4m9q\") pod \"nova-cell1-4cbd-account-create-6zmlh\" (UID: \"1b219057-d805-46f6-b393-8c05f64ff2ce\") " pod="openstack/nova-cell1-4cbd-account-create-6zmlh" Oct 05 20:32:32 crc kubenswrapper[4753]: I1005 20:32:32.337805 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4m9q\" (UniqueName: \"kubernetes.io/projected/1b219057-d805-46f6-b393-8c05f64ff2ce-kube-api-access-c4m9q\") pod \"nova-cell1-4cbd-account-create-6zmlh\" (UID: \"1b219057-d805-46f6-b393-8c05f64ff2ce\") " pod="openstack/nova-cell1-4cbd-account-create-6zmlh" Oct 05 20:32:32 crc kubenswrapper[4753]: I1005 20:32:32.516292 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4cbd-account-create-6zmlh" Oct 05 20:32:32 crc kubenswrapper[4753]: I1005 20:32:32.732206 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3a64-account-create-b4hzd"] Oct 05 20:32:32 crc kubenswrapper[4753]: I1005 20:32:32.890378 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-15d6-account-create-fsjpw"] Oct 05 20:32:32 crc kubenswrapper[4753]: W1005 20:32:32.892744 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d8b551b_c44d_42dc_8a3b_0fbf1df013f2.slice/crio-5df42783bcc0a58ce24126e7b14e7eb24de409ffe3c035daa9f972b6a0a308c0 WatchSource:0}: Error finding container 5df42783bcc0a58ce24126e7b14e7eb24de409ffe3c035daa9f972b6a0a308c0: Status 404 returned error can't find the container with id 5df42783bcc0a58ce24126e7b14e7eb24de409ffe3c035daa9f972b6a0a308c0 Oct 05 20:32:33 crc kubenswrapper[4753]: I1005 20:32:33.044831 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4cbd-account-create-6zmlh"] Oct 05 20:32:33 crc kubenswrapper[4753]: I1005 20:32:33.402839 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 05 20:32:33 crc kubenswrapper[4753]: I1005 20:32:33.745713 4753 generic.go:334] "Generic (PLEG): container finished" podID="1b219057-d805-46f6-b393-8c05f64ff2ce" containerID="c59099e2f1ac96a60533b2e0becc9d6c8dcc380b6b707d1dee0df9f1587f1077" exitCode=0 Oct 05 20:32:33 crc kubenswrapper[4753]: I1005 20:32:33.745774 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4cbd-account-create-6zmlh" event={"ID":"1b219057-d805-46f6-b393-8c05f64ff2ce","Type":"ContainerDied","Data":"c59099e2f1ac96a60533b2e0becc9d6c8dcc380b6b707d1dee0df9f1587f1077"} Oct 05 20:32:33 crc kubenswrapper[4753]: I1005 20:32:33.746103 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-4cbd-account-create-6zmlh" event={"ID":"1b219057-d805-46f6-b393-8c05f64ff2ce","Type":"ContainerStarted","Data":"baf127f0554a125120d57c60bb7eb275524e3af578e1ae54e49e21f4b307d6d2"} Oct 05 20:32:33 crc kubenswrapper[4753]: I1005 20:32:33.748313 4753 generic.go:334] "Generic (PLEG): container finished" podID="6d8b551b-c44d-42dc-8a3b-0fbf1df013f2" containerID="c0027ccdaa8a764b096bd84cda96635990185c17af50476fe240fafdc87c0a4b" exitCode=0 Oct 05 20:32:33 crc kubenswrapper[4753]: I1005 20:32:33.748359 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-15d6-account-create-fsjpw" event={"ID":"6d8b551b-c44d-42dc-8a3b-0fbf1df013f2","Type":"ContainerDied","Data":"c0027ccdaa8a764b096bd84cda96635990185c17af50476fe240fafdc87c0a4b"} Oct 05 20:32:33 crc kubenswrapper[4753]: I1005 20:32:33.748375 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-15d6-account-create-fsjpw" event={"ID":"6d8b551b-c44d-42dc-8a3b-0fbf1df013f2","Type":"ContainerStarted","Data":"5df42783bcc0a58ce24126e7b14e7eb24de409ffe3c035daa9f972b6a0a308c0"} Oct 05 20:32:33 crc kubenswrapper[4753]: I1005 20:32:33.749698 4753 generic.go:334] "Generic (PLEG): container finished" podID="9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4" containerID="930021d9264c79b3a661cdaabc77f3ad8ebf959f930b845dc018bf3d126170d7" exitCode=0 Oct 05 20:32:33 crc kubenswrapper[4753]: I1005 20:32:33.749729 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3a64-account-create-b4hzd" event={"ID":"9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4","Type":"ContainerDied","Data":"930021d9264c79b3a661cdaabc77f3ad8ebf959f930b845dc018bf3d126170d7"} Oct 05 20:32:33 crc kubenswrapper[4753]: I1005 20:32:33.749745 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3a64-account-create-b4hzd" event={"ID":"9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4","Type":"ContainerStarted","Data":"4eef81959e1982a7980bd6e6eb0d053279d4574e1d840975e16f2e21b1a03339"} 
Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.236976 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-15d6-account-create-fsjpw" Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.244639 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3a64-account-create-b4hzd" Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.259429 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4cbd-account-create-6zmlh" Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.290920 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkcvb\" (UniqueName: \"kubernetes.io/projected/9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4-kube-api-access-pkcvb\") pod \"9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4\" (UID: \"9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4\") " Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.291106 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mq8t\" (UniqueName: \"kubernetes.io/projected/6d8b551b-c44d-42dc-8a3b-0fbf1df013f2-kube-api-access-2mq8t\") pod \"6d8b551b-c44d-42dc-8a3b-0fbf1df013f2\" (UID: \"6d8b551b-c44d-42dc-8a3b-0fbf1df013f2\") " Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.298844 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4-kube-api-access-pkcvb" (OuterVolumeSpecName: "kube-api-access-pkcvb") pod "9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4" (UID: "9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4"). InnerVolumeSpecName "kube-api-access-pkcvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.315844 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d8b551b-c44d-42dc-8a3b-0fbf1df013f2-kube-api-access-2mq8t" (OuterVolumeSpecName: "kube-api-access-2mq8t") pod "6d8b551b-c44d-42dc-8a3b-0fbf1df013f2" (UID: "6d8b551b-c44d-42dc-8a3b-0fbf1df013f2"). InnerVolumeSpecName "kube-api-access-2mq8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.393786 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4m9q\" (UniqueName: \"kubernetes.io/projected/1b219057-d805-46f6-b393-8c05f64ff2ce-kube-api-access-c4m9q\") pod \"1b219057-d805-46f6-b393-8c05f64ff2ce\" (UID: \"1b219057-d805-46f6-b393-8c05f64ff2ce\") " Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.394373 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkcvb\" (UniqueName: \"kubernetes.io/projected/9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4-kube-api-access-pkcvb\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.394393 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mq8t\" (UniqueName: \"kubernetes.io/projected/6d8b551b-c44d-42dc-8a3b-0fbf1df013f2-kube-api-access-2mq8t\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.396479 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b219057-d805-46f6-b393-8c05f64ff2ce-kube-api-access-c4m9q" (OuterVolumeSpecName: "kube-api-access-c4m9q") pod "1b219057-d805-46f6-b393-8c05f64ff2ce" (UID: "1b219057-d805-46f6-b393-8c05f64ff2ce"). InnerVolumeSpecName "kube-api-access-c4m9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.495386 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4m9q\" (UniqueName: \"kubernetes.io/projected/1b219057-d805-46f6-b393-8c05f64ff2ce-kube-api-access-c4m9q\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.767552 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4cbd-account-create-6zmlh" event={"ID":"1b219057-d805-46f6-b393-8c05f64ff2ce","Type":"ContainerDied","Data":"baf127f0554a125120d57c60bb7eb275524e3af578e1ae54e49e21f4b307d6d2"} Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.767618 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baf127f0554a125120d57c60bb7eb275524e3af578e1ae54e49e21f4b307d6d2" Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.767564 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4cbd-account-create-6zmlh" Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.774864 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-15d6-account-create-fsjpw" Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.774867 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-15d6-account-create-fsjpw" event={"ID":"6d8b551b-c44d-42dc-8a3b-0fbf1df013f2","Type":"ContainerDied","Data":"5df42783bcc0a58ce24126e7b14e7eb24de409ffe3c035daa9f972b6a0a308c0"} Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.774905 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5df42783bcc0a58ce24126e7b14e7eb24de409ffe3c035daa9f972b6a0a308c0" Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.776776 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3a64-account-create-b4hzd" event={"ID":"9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4","Type":"ContainerDied","Data":"4eef81959e1982a7980bd6e6eb0d053279d4574e1d840975e16f2e21b1a03339"} Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.776808 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4eef81959e1982a7980bd6e6eb0d053279d4574e1d840975e16f2e21b1a03339" Oct 05 20:32:35 crc kubenswrapper[4753]: I1005 20:32:35.776831 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3a64-account-create-b4hzd" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.205263 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5hvlp"] Oct 05 20:32:37 crc kubenswrapper[4753]: E1005 20:32:37.205994 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b219057-d805-46f6-b393-8c05f64ff2ce" containerName="mariadb-account-create" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.206009 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b219057-d805-46f6-b393-8c05f64ff2ce" containerName="mariadb-account-create" Oct 05 20:32:37 crc kubenswrapper[4753]: E1005 20:32:37.206028 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4" containerName="mariadb-account-create" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.206036 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4" containerName="mariadb-account-create" Oct 05 20:32:37 crc kubenswrapper[4753]: E1005 20:32:37.206066 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8b551b-c44d-42dc-8a3b-0fbf1df013f2" containerName="mariadb-account-create" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.206075 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8b551b-c44d-42dc-8a3b-0fbf1df013f2" containerName="mariadb-account-create" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.206287 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4" containerName="mariadb-account-create" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.206300 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8b551b-c44d-42dc-8a3b-0fbf1df013f2" containerName="mariadb-account-create" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.206326 4753 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1b219057-d805-46f6-b393-8c05f64ff2ce" containerName="mariadb-account-create" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.207101 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.210330 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nlzfn" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.210858 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.211071 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.219405 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5hvlp"] Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.326712 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5hvlp\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.326792 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-scripts\") pod \"nova-cell0-conductor-db-sync-5hvlp\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.326848 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xlpqh\" (UniqueName: \"kubernetes.io/projected/698096ab-42c7-4f67-87b8-27d612fd3c25-kube-api-access-xlpqh\") pod \"nova-cell0-conductor-db-sync-5hvlp\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.327073 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-config-data\") pod \"nova-cell0-conductor-db-sync-5hvlp\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.428318 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlpqh\" (UniqueName: \"kubernetes.io/projected/698096ab-42c7-4f67-87b8-27d612fd3c25-kube-api-access-xlpqh\") pod \"nova-cell0-conductor-db-sync-5hvlp\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.428417 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-config-data\") pod \"nova-cell0-conductor-db-sync-5hvlp\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.428466 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5hvlp\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.428522 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-scripts\") pod \"nova-cell0-conductor-db-sync-5hvlp\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.439518 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-scripts\") pod \"nova-cell0-conductor-db-sync-5hvlp\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.444668 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5hvlp\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.447255 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-config-data\") pod \"nova-cell0-conductor-db-sync-5hvlp\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.448702 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlpqh\" (UniqueName: \"kubernetes.io/projected/698096ab-42c7-4f67-87b8-27d612fd3c25-kube-api-access-xlpqh\") pod \"nova-cell0-conductor-db-sync-5hvlp\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.522399 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.807829 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ae2325dc-1d53-4605-84d9-c5a341d6c311","Type":"ContainerStarted","Data":"50569d3845c00fb1e7b31c7ca9f8e13a6212158b96b33ad3acadec16a288b596"} Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.825318 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.238303408 podStartE2EDuration="41.825301542s" podCreationTimestamp="2025-10-05 20:31:56 +0000 UTC" firstStartedPulling="2025-10-05 20:31:57.695794315 +0000 UTC m=+1026.544122547" lastFinishedPulling="2025-10-05 20:32:37.282792449 +0000 UTC m=+1066.131120681" observedRunningTime="2025-10-05 20:32:37.824074444 +0000 UTC m=+1066.672402676" watchObservedRunningTime="2025-10-05 20:32:37.825301542 +0000 UTC m=+1066.673629764" Oct 05 20:32:37 crc kubenswrapper[4753]: I1005 20:32:37.954754 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5hvlp"] Oct 05 20:32:38 crc kubenswrapper[4753]: I1005 20:32:38.825170 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5hvlp" event={"ID":"698096ab-42c7-4f67-87b8-27d612fd3c25","Type":"ContainerStarted","Data":"db26a07c03ac413eb9735d289fa453863ee5595e8f38da293e3fb8681bba954c"} Oct 05 20:32:45 crc kubenswrapper[4753]: I1005 20:32:45.917087 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5hvlp" event={"ID":"698096ab-42c7-4f67-87b8-27d612fd3c25","Type":"ContainerStarted","Data":"6494a3d3da0cdfcacd48411aca0233ba49d748583d65f344fa09d00ad812e2ff"} Oct 05 20:32:45 crc kubenswrapper[4753]: I1005 20:32:45.939717 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-5hvlp" 
podStartSLOduration=1.627340526 podStartE2EDuration="8.939696898s" podCreationTimestamp="2025-10-05 20:32:37 +0000 UTC" firstStartedPulling="2025-10-05 20:32:37.957479713 +0000 UTC m=+1066.805807945" lastFinishedPulling="2025-10-05 20:32:45.269836065 +0000 UTC m=+1074.118164317" observedRunningTime="2025-10-05 20:32:45.934660541 +0000 UTC m=+1074.782988773" watchObservedRunningTime="2025-10-05 20:32:45.939696898 +0000 UTC m=+1074.788025130" Oct 05 20:32:53 crc kubenswrapper[4753]: I1005 20:32:53.019868 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 05 20:32:55 crc kubenswrapper[4753]: I1005 20:32:55.011238 4753 generic.go:334] "Generic (PLEG): container finished" podID="698096ab-42c7-4f67-87b8-27d612fd3c25" containerID="6494a3d3da0cdfcacd48411aca0233ba49d748583d65f344fa09d00ad812e2ff" exitCode=0 Oct 05 20:32:55 crc kubenswrapper[4753]: I1005 20:32:55.011490 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5hvlp" event={"ID":"698096ab-42c7-4f67-87b8-27d612fd3c25","Type":"ContainerDied","Data":"6494a3d3da0cdfcacd48411aca0233ba49d748583d65f344fa09d00ad812e2ff"} Oct 05 20:32:56 crc kubenswrapper[4753]: I1005 20:32:56.438218 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:56 crc kubenswrapper[4753]: I1005 20:32:56.573110 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlpqh\" (UniqueName: \"kubernetes.io/projected/698096ab-42c7-4f67-87b8-27d612fd3c25-kube-api-access-xlpqh\") pod \"698096ab-42c7-4f67-87b8-27d612fd3c25\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " Oct 05 20:32:56 crc kubenswrapper[4753]: I1005 20:32:56.573355 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-scripts\") pod \"698096ab-42c7-4f67-87b8-27d612fd3c25\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " Oct 05 20:32:56 crc kubenswrapper[4753]: I1005 20:32:56.573415 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-config-data\") pod \"698096ab-42c7-4f67-87b8-27d612fd3c25\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " Oct 05 20:32:56 crc kubenswrapper[4753]: I1005 20:32:56.573454 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-combined-ca-bundle\") pod \"698096ab-42c7-4f67-87b8-27d612fd3c25\" (UID: \"698096ab-42c7-4f67-87b8-27d612fd3c25\") " Oct 05 20:32:56 crc kubenswrapper[4753]: I1005 20:32:56.578534 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-scripts" (OuterVolumeSpecName: "scripts") pod "698096ab-42c7-4f67-87b8-27d612fd3c25" (UID: "698096ab-42c7-4f67-87b8-27d612fd3c25"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:56 crc kubenswrapper[4753]: I1005 20:32:56.580956 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698096ab-42c7-4f67-87b8-27d612fd3c25-kube-api-access-xlpqh" (OuterVolumeSpecName: "kube-api-access-xlpqh") pod "698096ab-42c7-4f67-87b8-27d612fd3c25" (UID: "698096ab-42c7-4f67-87b8-27d612fd3c25"). InnerVolumeSpecName "kube-api-access-xlpqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:32:56 crc kubenswrapper[4753]: I1005 20:32:56.598305 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-config-data" (OuterVolumeSpecName: "config-data") pod "698096ab-42c7-4f67-87b8-27d612fd3c25" (UID: "698096ab-42c7-4f67-87b8-27d612fd3c25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:56 crc kubenswrapper[4753]: I1005 20:32:56.616776 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "698096ab-42c7-4f67-87b8-27d612fd3c25" (UID: "698096ab-42c7-4f67-87b8-27d612fd3c25"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:56 crc kubenswrapper[4753]: I1005 20:32:56.675733 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:56 crc kubenswrapper[4753]: I1005 20:32:56.676005 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlpqh\" (UniqueName: \"kubernetes.io/projected/698096ab-42c7-4f67-87b8-27d612fd3c25-kube-api-access-xlpqh\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:56 crc kubenswrapper[4753]: I1005 20:32:56.676148 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:56 crc kubenswrapper[4753]: I1005 20:32:56.676221 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/698096ab-42c7-4f67-87b8-27d612fd3c25-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.027918 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5hvlp" event={"ID":"698096ab-42c7-4f67-87b8-27d612fd3c25","Type":"ContainerDied","Data":"db26a07c03ac413eb9735d289fa453863ee5595e8f38da293e3fb8681bba954c"} Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.027964 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db26a07c03ac413eb9735d289fa453863ee5595e8f38da293e3fb8681bba954c" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.027977 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5hvlp" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.132367 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 05 20:32:57 crc kubenswrapper[4753]: E1005 20:32:57.132838 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698096ab-42c7-4f67-87b8-27d612fd3c25" containerName="nova-cell0-conductor-db-sync" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.132860 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="698096ab-42c7-4f67-87b8-27d612fd3c25" containerName="nova-cell0-conductor-db-sync" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.133080 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="698096ab-42c7-4f67-87b8-27d612fd3c25" containerName="nova-cell0-conductor-db-sync" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.133764 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.140483 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.141776 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nlzfn" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.148845 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.185657 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ddb5b3-36ec-421d-a5d0-f465f7cf0316-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d8ddb5b3-36ec-421d-a5d0-f465f7cf0316\") " pod="openstack/nova-cell0-conductor-0" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 
20:32:57.185734 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bzps\" (UniqueName: \"kubernetes.io/projected/d8ddb5b3-36ec-421d-a5d0-f465f7cf0316-kube-api-access-4bzps\") pod \"nova-cell0-conductor-0\" (UID: \"d8ddb5b3-36ec-421d-a5d0-f465f7cf0316\") " pod="openstack/nova-cell0-conductor-0" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.185786 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ddb5b3-36ec-421d-a5d0-f465f7cf0316-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d8ddb5b3-36ec-421d-a5d0-f465f7cf0316\") " pod="openstack/nova-cell0-conductor-0" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.288089 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ddb5b3-36ec-421d-a5d0-f465f7cf0316-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d8ddb5b3-36ec-421d-a5d0-f465f7cf0316\") " pod="openstack/nova-cell0-conductor-0" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.288207 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ddb5b3-36ec-421d-a5d0-f465f7cf0316-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d8ddb5b3-36ec-421d-a5d0-f465f7cf0316\") " pod="openstack/nova-cell0-conductor-0" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.288275 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzps\" (UniqueName: \"kubernetes.io/projected/d8ddb5b3-36ec-421d-a5d0-f465f7cf0316-kube-api-access-4bzps\") pod \"nova-cell0-conductor-0\" (UID: \"d8ddb5b3-36ec-421d-a5d0-f465f7cf0316\") " pod="openstack/nova-cell0-conductor-0" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.293103 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ddb5b3-36ec-421d-a5d0-f465f7cf0316-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d8ddb5b3-36ec-421d-a5d0-f465f7cf0316\") " pod="openstack/nova-cell0-conductor-0" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.293192 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ddb5b3-36ec-421d-a5d0-f465f7cf0316-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d8ddb5b3-36ec-421d-a5d0-f465f7cf0316\") " pod="openstack/nova-cell0-conductor-0" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.306720 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bzps\" (UniqueName: \"kubernetes.io/projected/d8ddb5b3-36ec-421d-a5d0-f465f7cf0316-kube-api-access-4bzps\") pod \"nova-cell0-conductor-0\" (UID: \"d8ddb5b3-36ec-421d-a5d0-f465f7cf0316\") " pod="openstack/nova-cell0-conductor-0" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.451732 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 05 20:32:57 crc kubenswrapper[4753]: I1005 20:32:57.904670 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 05 20:32:58 crc kubenswrapper[4753]: I1005 20:32:58.040130 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d8ddb5b3-36ec-421d-a5d0-f465f7cf0316","Type":"ContainerStarted","Data":"35ee7935d04a6e77d3ddaae5c592030725ae1fdf3d268f26c7878500abc5cf43"} Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.050077 4753 generic.go:334] "Generic (PLEG): container finished" podID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerID="03676ed7176e8f2b092d6f5287d264b28585b1a12edfda853f3a5d000993a858" exitCode=137 Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.050241 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4","Type":"ContainerDied","Data":"03676ed7176e8f2b092d6f5287d264b28585b1a12edfda853f3a5d000993a858"} Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.062008 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d8ddb5b3-36ec-421d-a5d0-f465f7cf0316","Type":"ContainerStarted","Data":"f18120b18836b8ff0f212f7fde855a1163ebfb85430c14ce646c41d4ff43549c"} Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.062230 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.081445 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.081421807 podStartE2EDuration="2.081421807s" podCreationTimestamp="2025-10-05 20:32:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 
20:32:59.078443065 +0000 UTC m=+1087.926771307" watchObservedRunningTime="2025-10-05 20:32:59.081421807 +0000 UTC m=+1087.929750039" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.130369 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.227341 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-config-data\") pod \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.227391 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-scripts\") pod \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.227481 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7twt\" (UniqueName: \"kubernetes.io/projected/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-kube-api-access-p7twt\") pod \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.227557 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-run-httpd\") pod \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.227631 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-combined-ca-bundle\") pod 
\"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.227652 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-log-httpd\") pod \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.227706 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-sg-core-conf-yaml\") pod \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\" (UID: \"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4\") " Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.230073 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" (UID: "f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.230310 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" (UID: "f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.245739 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-kube-api-access-p7twt" (OuterVolumeSpecName: "kube-api-access-p7twt") pod "f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" (UID: "f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4"). InnerVolumeSpecName "kube-api-access-p7twt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.245870 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-scripts" (OuterVolumeSpecName: "scripts") pod "f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" (UID: "f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.255842 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" (UID: "f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.296507 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" (UID: "f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.321238 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-config-data" (OuterVolumeSpecName: "config-data") pod "f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" (UID: "f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.329329 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7twt\" (UniqueName: \"kubernetes.io/projected/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-kube-api-access-p7twt\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.329369 4753 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.329382 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.329394 4753 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.329405 4753 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.329414 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:32:59 crc kubenswrapper[4753]: I1005 20:32:59.329421 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.092890 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.093836 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4","Type":"ContainerDied","Data":"5a34d799b2d21e502ac4ca672ce4fa56f41398826e93d83119c983d4f19571fb"} Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.093897 4753 scope.go:117] "RemoveContainer" containerID="03676ed7176e8f2b092d6f5287d264b28585b1a12edfda853f3a5d000993a858" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.166115 4753 scope.go:117] "RemoveContainer" containerID="bcd746e8b6eccc4aa8323eda49023536e43cf4a097a02cdd5edb52f5a0dcd56d" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.172405 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.181733 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.187400 4753 scope.go:117] "RemoveContainer" containerID="c1b4561bffbec804356bb7dfe417f821e001b1734ecafdaaaa0a6694618af322" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.219649 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:00 crc kubenswrapper[4753]: E1005 20:33:00.220205 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" 
containerName="proxy-httpd" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.220271 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="proxy-httpd" Oct 05 20:33:00 crc kubenswrapper[4753]: E1005 20:33:00.220329 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="ceilometer-central-agent" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.220377 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="ceilometer-central-agent" Oct 05 20:33:00 crc kubenswrapper[4753]: E1005 20:33:00.220457 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="sg-core" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.220506 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="sg-core" Oct 05 20:33:00 crc kubenswrapper[4753]: E1005 20:33:00.220613 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="ceilometer-notification-agent" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.220671 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="ceilometer-notification-agent" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.221094 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="ceilometer-central-agent" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.221378 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="proxy-httpd" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.221474 4753 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="sg-core" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.221539 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" containerName="ceilometer-notification-agent" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.224484 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.235740 4753 scope.go:117] "RemoveContainer" containerID="fb2864a14bf737b93ea0924247222121c9b43a6800e33f8020ce5da4703e9411" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.236369 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.236408 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.244303 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.353108 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/072980fd-9fc2-464a-8c86-4201a00a3bae-run-httpd\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.353382 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/072980fd-9fc2-464a-8c86-4201a00a3bae-log-httpd\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.353439 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-pk8n8\" (UniqueName: \"kubernetes.io/projected/072980fd-9fc2-464a-8c86-4201a00a3bae-kube-api-access-pk8n8\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.353520 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.353574 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.353618 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-scripts\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.353645 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-config-data\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.455040 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.455277 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.455364 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-scripts\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.455433 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-config-data\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.455536 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/072980fd-9fc2-464a-8c86-4201a00a3bae-run-httpd\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.455661 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/072980fd-9fc2-464a-8c86-4201a00a3bae-log-httpd\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.455728 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pk8n8\" (UniqueName: \"kubernetes.io/projected/072980fd-9fc2-464a-8c86-4201a00a3bae-kube-api-access-pk8n8\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.456982 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/072980fd-9fc2-464a-8c86-4201a00a3bae-log-httpd\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.457458 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/072980fd-9fc2-464a-8c86-4201a00a3bae-run-httpd\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.463573 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.463671 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-scripts\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.464248 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-config-data\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.473448 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk8n8\" (UniqueName: \"kubernetes.io/projected/072980fd-9fc2-464a-8c86-4201a00a3bae-kube-api-access-pk8n8\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.482972 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.560805 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:33:00 crc kubenswrapper[4753]: I1005 20:33:00.985741 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:00 crc kubenswrapper[4753]: W1005 20:33:00.993263 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod072980fd_9fc2_464a_8c86_4201a00a3bae.slice/crio-cf138043f72ad1bd9994af9c39807b7fca6ddc07e00e018b2e2da9b9fe6bc6d3 WatchSource:0}: Error finding container cf138043f72ad1bd9994af9c39807b7fca6ddc07e00e018b2e2da9b9fe6bc6d3: Status 404 returned error can't find the container with id cf138043f72ad1bd9994af9c39807b7fca6ddc07e00e018b2e2da9b9fe6bc6d3 Oct 05 20:33:01 crc kubenswrapper[4753]: I1005 20:33:01.109515 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"072980fd-9fc2-464a-8c86-4201a00a3bae","Type":"ContainerStarted","Data":"cf138043f72ad1bd9994af9c39807b7fca6ddc07e00e018b2e2da9b9fe6bc6d3"} Oct 05 20:33:01 crc kubenswrapper[4753]: I1005 20:33:01.861166 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4" 
path="/var/lib/kubelet/pods/f11f3bfa-3b3c-4994-9c9f-f73fbc0b60a4/volumes" Oct 05 20:33:02 crc kubenswrapper[4753]: I1005 20:33:02.116824 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"072980fd-9fc2-464a-8c86-4201a00a3bae","Type":"ContainerStarted","Data":"5eb60247cb3ef6af8280a13e7f38dcec1294e0cd49b0053d1ece17713f412310"} Oct 05 20:33:03 crc kubenswrapper[4753]: I1005 20:33:03.129759 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"072980fd-9fc2-464a-8c86-4201a00a3bae","Type":"ContainerStarted","Data":"24888d45a9c6e3091f5b36f909ec1d4356eafe88accf60937c58f1eb4b5a59db"} Oct 05 20:33:03 crc kubenswrapper[4753]: I1005 20:33:03.129797 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"072980fd-9fc2-464a-8c86-4201a00a3bae","Type":"ContainerStarted","Data":"375a2c45be80c2b18c8e6e41b91ad8e21299a50d1ecc3e966854ec9b80d208f3"} Oct 05 20:33:05 crc kubenswrapper[4753]: I1005 20:33:05.169489 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"072980fd-9fc2-464a-8c86-4201a00a3bae","Type":"ContainerStarted","Data":"479a2c9e2d4320203219f6ce98880f91a2d4c825cd0a5eaa5c85ff06f32d8bec"} Oct 05 20:33:05 crc kubenswrapper[4753]: I1005 20:33:05.170105 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 05 20:33:07 crc kubenswrapper[4753]: I1005 20:33:07.485593 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 05 20:33:07 crc kubenswrapper[4753]: I1005 20:33:07.503097 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.679778187 podStartE2EDuration="7.503075515s" podCreationTimestamp="2025-10-05 20:33:00 +0000 UTC" firstStartedPulling="2025-10-05 20:33:00.99539406 +0000 UTC m=+1089.843722282" 
lastFinishedPulling="2025-10-05 20:33:04.818691378 +0000 UTC m=+1093.667019610" observedRunningTime="2025-10-05 20:33:05.198259609 +0000 UTC m=+1094.046587841" watchObservedRunningTime="2025-10-05 20:33:07.503075515 +0000 UTC m=+1096.351403757" Oct 05 20:33:07 crc kubenswrapper[4753]: I1005 20:33:07.956598 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-kcbx2"] Oct 05 20:33:07 crc kubenswrapper[4753]: I1005 20:33:07.957985 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kcbx2" Oct 05 20:33:07 crc kubenswrapper[4753]: I1005 20:33:07.961834 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 05 20:33:07 crc kubenswrapper[4753]: I1005 20:33:07.962117 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 05 20:33:07 crc kubenswrapper[4753]: I1005 20:33:07.986275 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kcbx2\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " pod="openstack/nova-cell0-cell-mapping-kcbx2" Oct 05 20:33:07 crc kubenswrapper[4753]: I1005 20:33:07.986521 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-config-data\") pod \"nova-cell0-cell-mapping-kcbx2\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " pod="openstack/nova-cell0-cell-mapping-kcbx2" Oct 05 20:33:07 crc kubenswrapper[4753]: I1005 20:33:07.986741 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-scripts\") pod \"nova-cell0-cell-mapping-kcbx2\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " pod="openstack/nova-cell0-cell-mapping-kcbx2" Oct 05 20:33:07 crc kubenswrapper[4753]: I1005 20:33:07.987005 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42pjh\" (UniqueName: \"kubernetes.io/projected/3472112b-9f63-410c-b285-c5a8cd2fa2fc-kube-api-access-42pjh\") pod \"nova-cell0-cell-mapping-kcbx2\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " pod="openstack/nova-cell0-cell-mapping-kcbx2" Oct 05 20:33:07 crc kubenswrapper[4753]: I1005 20:33:07.987913 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kcbx2"] Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.088872 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-scripts\") pod \"nova-cell0-cell-mapping-kcbx2\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " pod="openstack/nova-cell0-cell-mapping-kcbx2" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.088985 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42pjh\" (UniqueName: \"kubernetes.io/projected/3472112b-9f63-410c-b285-c5a8cd2fa2fc-kube-api-access-42pjh\") pod \"nova-cell0-cell-mapping-kcbx2\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " pod="openstack/nova-cell0-cell-mapping-kcbx2" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.089024 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kcbx2\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " pod="openstack/nova-cell0-cell-mapping-kcbx2" Oct 05 20:33:08 crc 
kubenswrapper[4753]: I1005 20:33:08.089074 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-config-data\") pod \"nova-cell0-cell-mapping-kcbx2\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " pod="openstack/nova-cell0-cell-mapping-kcbx2" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.095400 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kcbx2\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " pod="openstack/nova-cell0-cell-mapping-kcbx2" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.096453 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-config-data\") pod \"nova-cell0-cell-mapping-kcbx2\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " pod="openstack/nova-cell0-cell-mapping-kcbx2" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.104729 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-scripts\") pod \"nova-cell0-cell-mapping-kcbx2\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " pod="openstack/nova-cell0-cell-mapping-kcbx2" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.135123 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42pjh\" (UniqueName: \"kubernetes.io/projected/3472112b-9f63-410c-b285-c5a8cd2fa2fc-kube-api-access-42pjh\") pod \"nova-cell0-cell-mapping-kcbx2\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " pod="openstack/nova-cell0-cell-mapping-kcbx2" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.141174 4753 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.142346 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.152529 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.164046 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.204018 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d24ec3-181b-4651-ab1d-59f1975c052a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2d24ec3-181b-4651-ab1d-59f1975c052a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.204164 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d24ec3-181b-4651-ab1d-59f1975c052a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2d24ec3-181b-4651-ab1d-59f1975c052a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.204439 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkrsb\" (UniqueName: \"kubernetes.io/projected/a2d24ec3-181b-4651-ab1d-59f1975c052a-kube-api-access-vkrsb\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2d24ec3-181b-4651-ab1d-59f1975c052a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.279086 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kcbx2" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.300033 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.301422 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.330116 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkrsb\" (UniqueName: \"kubernetes.io/projected/a2d24ec3-181b-4651-ab1d-59f1975c052a-kube-api-access-vkrsb\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2d24ec3-181b-4651-ab1d-59f1975c052a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.330190 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d24ec3-181b-4651-ab1d-59f1975c052a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2d24ec3-181b-4651-ab1d-59f1975c052a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.330223 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d24ec3-181b-4651-ab1d-59f1975c052a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2d24ec3-181b-4651-ab1d-59f1975c052a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.330978 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.354701 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.368554 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d24ec3-181b-4651-ab1d-59f1975c052a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2d24ec3-181b-4651-ab1d-59f1975c052a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.390478 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d24ec3-181b-4651-ab1d-59f1975c052a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2d24ec3-181b-4651-ab1d-59f1975c052a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.395411 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkrsb\" (UniqueName: \"kubernetes.io/projected/a2d24ec3-181b-4651-ab1d-59f1975c052a-kube-api-access-vkrsb\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2d24ec3-181b-4651-ab1d-59f1975c052a\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.409715 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.411200 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.418522 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.432889 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4daa75de-d231-4306-bc81-a5b6a77df4ff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4daa75de-d231-4306-bc81-a5b6a77df4ff\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.434576 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.438309 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.440392 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjhkd\" (UniqueName: \"kubernetes.io/projected/4daa75de-d231-4306-bc81-a5b6a77df4ff-kube-api-access-vjhkd\") pod \"nova-scheduler-0\" (UID: \"4daa75de-d231-4306-bc81-a5b6a77df4ff\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.440462 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4daa75de-d231-4306-bc81-a5b6a77df4ff-config-data\") pod \"nova-scheduler-0\" (UID: \"4daa75de-d231-4306-bc81-a5b6a77df4ff\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.454270 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.461381 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-api-0"] Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.481919 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.523932 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.545201 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a332dcd-7d9c-402c-9560-361e59390857-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " pod="openstack/nova-api-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.545382 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1530fd-875d-4fdb-aa54-e1faa5df1745-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " pod="openstack/nova-metadata-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.545412 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4daa75de-d231-4306-bc81-a5b6a77df4ff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4daa75de-d231-4306-bc81-a5b6a77df4ff\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.545438 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a332dcd-7d9c-402c-9560-361e59390857-config-data\") pod \"nova-api-0\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " pod="openstack/nova-api-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.545456 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be1530fd-875d-4fdb-aa54-e1faa5df1745-logs\") pod \"nova-metadata-0\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " pod="openstack/nova-metadata-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.545486 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhtz4\" (UniqueName: \"kubernetes.io/projected/8a332dcd-7d9c-402c-9560-361e59390857-kube-api-access-zhtz4\") pod \"nova-api-0\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " pod="openstack/nova-api-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.545503 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlh5c\" (UniqueName: \"kubernetes.io/projected/be1530fd-875d-4fdb-aa54-e1faa5df1745-kube-api-access-vlh5c\") pod \"nova-metadata-0\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " pod="openstack/nova-metadata-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.545536 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a332dcd-7d9c-402c-9560-361e59390857-logs\") pod \"nova-api-0\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " pod="openstack/nova-api-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.545558 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjhkd\" (UniqueName: \"kubernetes.io/projected/4daa75de-d231-4306-bc81-a5b6a77df4ff-kube-api-access-vjhkd\") pod \"nova-scheduler-0\" (UID: \"4daa75de-d231-4306-bc81-a5b6a77df4ff\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.545573 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/be1530fd-875d-4fdb-aa54-e1faa5df1745-config-data\") pod \"nova-metadata-0\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " pod="openstack/nova-metadata-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.545590 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4daa75de-d231-4306-bc81-a5b6a77df4ff-config-data\") pod \"nova-scheduler-0\" (UID: \"4daa75de-d231-4306-bc81-a5b6a77df4ff\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.552588 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4daa75de-d231-4306-bc81-a5b6a77df4ff-config-data\") pod \"nova-scheduler-0\" (UID: \"4daa75de-d231-4306-bc81-a5b6a77df4ff\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.561952 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4daa75de-d231-4306-bc81-a5b6a77df4ff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4daa75de-d231-4306-bc81-a5b6a77df4ff\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.565811 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b86468d5c-ghpzp"] Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.567726 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.577491 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjhkd\" (UniqueName: \"kubernetes.io/projected/4daa75de-d231-4306-bc81-a5b6a77df4ff-kube-api-access-vjhkd\") pod \"nova-scheduler-0\" (UID: \"4daa75de-d231-4306-bc81-a5b6a77df4ff\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.581501 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b86468d5c-ghpzp"] Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.650044 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a332dcd-7d9c-402c-9560-361e59390857-logs\") pod \"nova-api-0\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " pod="openstack/nova-api-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.650102 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1530fd-875d-4fdb-aa54-e1faa5df1745-config-data\") pod \"nova-metadata-0\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " pod="openstack/nova-metadata-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.650169 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-dns-svc\") pod \"dnsmasq-dns-7b86468d5c-ghpzp\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.650198 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45qmh\" (UniqueName: \"kubernetes.io/projected/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-kube-api-access-45qmh\") pod 
\"dnsmasq-dns-7b86468d5c-ghpzp\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.650272 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a332dcd-7d9c-402c-9560-361e59390857-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " pod="openstack/nova-api-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.650311 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1530fd-875d-4fdb-aa54-e1faa5df1745-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " pod="openstack/nova-metadata-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.650342 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-config\") pod \"dnsmasq-dns-7b86468d5c-ghpzp\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.650362 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a332dcd-7d9c-402c-9560-361e59390857-config-data\") pod \"nova-api-0\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " pod="openstack/nova-api-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.650379 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be1530fd-875d-4fdb-aa54-e1faa5df1745-logs\") pod \"nova-metadata-0\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " pod="openstack/nova-metadata-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.650405 
4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-ovsdbserver-nb\") pod \"dnsmasq-dns-7b86468d5c-ghpzp\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.650426 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhtz4\" (UniqueName: \"kubernetes.io/projected/8a332dcd-7d9c-402c-9560-361e59390857-kube-api-access-zhtz4\") pod \"nova-api-0\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " pod="openstack/nova-api-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.650445 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlh5c\" (UniqueName: \"kubernetes.io/projected/be1530fd-875d-4fdb-aa54-e1faa5df1745-kube-api-access-vlh5c\") pod \"nova-metadata-0\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " pod="openstack/nova-metadata-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.650464 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-ovsdbserver-sb\") pod \"dnsmasq-dns-7b86468d5c-ghpzp\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.650873 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a332dcd-7d9c-402c-9560-361e59390857-logs\") pod \"nova-api-0\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " pod="openstack/nova-api-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.654266 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/be1530fd-875d-4fdb-aa54-e1faa5df1745-logs\") pod \"nova-metadata-0\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " pod="openstack/nova-metadata-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.663298 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1530fd-875d-4fdb-aa54-e1faa5df1745-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " pod="openstack/nova-metadata-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.671616 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1530fd-875d-4fdb-aa54-e1faa5df1745-config-data\") pod \"nova-metadata-0\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " pod="openstack/nova-metadata-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.672736 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a332dcd-7d9c-402c-9560-361e59390857-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " pod="openstack/nova-api-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.679167 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a332dcd-7d9c-402c-9560-361e59390857-config-data\") pod \"nova-api-0\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " pod="openstack/nova-api-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.687717 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhtz4\" (UniqueName: \"kubernetes.io/projected/8a332dcd-7d9c-402c-9560-361e59390857-kube-api-access-zhtz4\") pod \"nova-api-0\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " pod="openstack/nova-api-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.688234 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlh5c\" (UniqueName: \"kubernetes.io/projected/be1530fd-875d-4fdb-aa54-e1faa5df1745-kube-api-access-vlh5c\") pod \"nova-metadata-0\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " pod="openstack/nova-metadata-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.735985 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.755986 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-dns-svc\") pod \"dnsmasq-dns-7b86468d5c-ghpzp\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.756030 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45qmh\" (UniqueName: \"kubernetes.io/projected/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-kube-api-access-45qmh\") pod \"dnsmasq-dns-7b86468d5c-ghpzp\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.756102 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-config\") pod \"dnsmasq-dns-7b86468d5c-ghpzp\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.756150 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-ovsdbserver-nb\") pod \"dnsmasq-dns-7b86468d5c-ghpzp\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " 
pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.756176 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-ovsdbserver-sb\") pod \"dnsmasq-dns-7b86468d5c-ghpzp\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.757287 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-ovsdbserver-sb\") pod \"dnsmasq-dns-7b86468d5c-ghpzp\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.757783 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-dns-svc\") pod \"dnsmasq-dns-7b86468d5c-ghpzp\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.758539 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-config\") pod \"dnsmasq-dns-7b86468d5c-ghpzp\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.759047 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-ovsdbserver-nb\") pod \"dnsmasq-dns-7b86468d5c-ghpzp\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.784291 4753 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.786168 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45qmh\" (UniqueName: \"kubernetes.io/projected/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-kube-api-access-45qmh\") pod \"dnsmasq-dns-7b86468d5c-ghpzp\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.840534 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.896410 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:08 crc kubenswrapper[4753]: I1005 20:33:08.921991 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kcbx2"] Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.192022 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.241941 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a2d24ec3-181b-4651-ab1d-59f1975c052a","Type":"ContainerStarted","Data":"b8572bee91d93572d19249f9bcbae54b042c1c9fd4849e0d0dc5b355d91d660e"} Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.242676 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kcbx2" event={"ID":"3472112b-9f63-410c-b285-c5a8cd2fa2fc","Type":"ContainerStarted","Data":"f280e8601c14118a24590f9728fd18f82bfe2629b9a621503ee6387a32f0465f"} Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.300250 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 
20:33:09.446593 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 05 20:33:09 crc kubenswrapper[4753]: W1005 20:33:09.474418 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a332dcd_7d9c_402c_9560_361e59390857.slice/crio-e7b48bddffb0a53dc7c20d091ef3f285e91d3169ff7948b817ae23ee8fbc4bed WatchSource:0}: Error finding container e7b48bddffb0a53dc7c20d091ef3f285e91d3169ff7948b817ae23ee8fbc4bed: Status 404 returned error can't find the container with id e7b48bddffb0a53dc7c20d091ef3f285e91d3169ff7948b817ae23ee8fbc4bed Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.556676 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.620306 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b86468d5c-ghpzp"] Oct 05 20:33:09 crc kubenswrapper[4753]: W1005 20:33:09.623282 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bcf51ef_4faa_4209_bbf8_b17d7177d0b5.slice/crio-1ba09cfc3f029affaee102d88168273ad1ea1fa79163b1a6fe243d1258d745f5 WatchSource:0}: Error finding container 1ba09cfc3f029affaee102d88168273ad1ea1fa79163b1a6fe243d1258d745f5: Status 404 returned error can't find the container with id 1ba09cfc3f029affaee102d88168273ad1ea1fa79163b1a6fe243d1258d745f5 Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.701809 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7qts4"] Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.703213 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7qts4" Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.707458 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.707854 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.726779 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7qts4"] Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.782728 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-config-data\") pod \"nova-cell1-conductor-db-sync-7qts4\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") " pod="openstack/nova-cell1-conductor-db-sync-7qts4" Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.782804 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7qts4\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") " pod="openstack/nova-cell1-conductor-db-sync-7qts4" Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.782877 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-scripts\") pod \"nova-cell1-conductor-db-sync-7qts4\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") " pod="openstack/nova-cell1-conductor-db-sync-7qts4" Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.782927 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xn5dw\" (UniqueName: \"kubernetes.io/projected/72e71171-91fe-4161-9899-93934608eaa2-kube-api-access-xn5dw\") pod \"nova-cell1-conductor-db-sync-7qts4\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") " pod="openstack/nova-cell1-conductor-db-sync-7qts4" Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.885073 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-scripts\") pod \"nova-cell1-conductor-db-sync-7qts4\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") " pod="openstack/nova-cell1-conductor-db-sync-7qts4" Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.887443 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn5dw\" (UniqueName: \"kubernetes.io/projected/72e71171-91fe-4161-9899-93934608eaa2-kube-api-access-xn5dw\") pod \"nova-cell1-conductor-db-sync-7qts4\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") " pod="openstack/nova-cell1-conductor-db-sync-7qts4" Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.887579 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-config-data\") pod \"nova-cell1-conductor-db-sync-7qts4\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") " pod="openstack/nova-cell1-conductor-db-sync-7qts4" Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.887701 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7qts4\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") " pod="openstack/nova-cell1-conductor-db-sync-7qts4" Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.891492 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-scripts\") pod \"nova-cell1-conductor-db-sync-7qts4\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") " pod="openstack/nova-cell1-conductor-db-sync-7qts4" Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.893920 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7qts4\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") " pod="openstack/nova-cell1-conductor-db-sync-7qts4" Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.894937 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-config-data\") pod \"nova-cell1-conductor-db-sync-7qts4\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") " pod="openstack/nova-cell1-conductor-db-sync-7qts4" Oct 05 20:33:09 crc kubenswrapper[4753]: I1005 20:33:09.911541 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn5dw\" (UniqueName: \"kubernetes.io/projected/72e71171-91fe-4161-9899-93934608eaa2-kube-api-access-xn5dw\") pod \"nova-cell1-conductor-db-sync-7qts4\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") " pod="openstack/nova-cell1-conductor-db-sync-7qts4" Oct 05 20:33:10 crc kubenswrapper[4753]: I1005 20:33:10.021768 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7qts4" Oct 05 20:33:10 crc kubenswrapper[4753]: I1005 20:33:10.319905 4753 generic.go:334] "Generic (PLEG): container finished" podID="9bcf51ef-4faa-4209-bbf8-b17d7177d0b5" containerID="b60e4d3dbb25d2cdb7f2c61ebedbaf573e141c1737917000b62dcac0181a8aaa" exitCode=0 Oct 05 20:33:10 crc kubenswrapper[4753]: I1005 20:33:10.320190 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" event={"ID":"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5","Type":"ContainerDied","Data":"b60e4d3dbb25d2cdb7f2c61ebedbaf573e141c1737917000b62dcac0181a8aaa"} Oct 05 20:33:10 crc kubenswrapper[4753]: I1005 20:33:10.320282 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" event={"ID":"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5","Type":"ContainerStarted","Data":"1ba09cfc3f029affaee102d88168273ad1ea1fa79163b1a6fe243d1258d745f5"} Oct 05 20:33:10 crc kubenswrapper[4753]: I1005 20:33:10.363572 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kcbx2" event={"ID":"3472112b-9f63-410c-b285-c5a8cd2fa2fc","Type":"ContainerStarted","Data":"8b596b16decd9c2626d87ec85a3e00bbf094c8b9354c3ffb26d418d739010f29"} Oct 05 20:33:10 crc kubenswrapper[4753]: I1005 20:33:10.395262 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-kcbx2" podStartSLOduration=3.395247395 podStartE2EDuration="3.395247395s" podCreationTimestamp="2025-10-05 20:33:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:33:10.392895952 +0000 UTC m=+1099.241224184" watchObservedRunningTime="2025-10-05 20:33:10.395247395 +0000 UTC m=+1099.243575627" Oct 05 20:33:10 crc kubenswrapper[4753]: I1005 20:33:10.406497 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"4daa75de-d231-4306-bc81-a5b6a77df4ff","Type":"ContainerStarted","Data":"e74cf2d59c74e2261a39377b3cf5acac24ed7ecb822588ab08b888455d31091b"} Oct 05 20:33:10 crc kubenswrapper[4753]: I1005 20:33:10.430609 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a332dcd-7d9c-402c-9560-361e59390857","Type":"ContainerStarted","Data":"e7b48bddffb0a53dc7c20d091ef3f285e91d3169ff7948b817ae23ee8fbc4bed"} Oct 05 20:33:10 crc kubenswrapper[4753]: I1005 20:33:10.440017 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"be1530fd-875d-4fdb-aa54-e1faa5df1745","Type":"ContainerStarted","Data":"ca7fb1aad33d0f4e5e00955095b040f1e2a29b213cd699338780c7f48383407e"} Oct 05 20:33:10 crc kubenswrapper[4753]: I1005 20:33:10.613358 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7qts4"] Oct 05 20:33:11 crc kubenswrapper[4753]: I1005 20:33:11.458591 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" event={"ID":"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5","Type":"ContainerStarted","Data":"61cf56781e7162cf823930e3e3d0b49ad95f64dcee4252339d9cabb15b161200"} Oct 05 20:33:11 crc kubenswrapper[4753]: I1005 20:33:11.458991 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:11 crc kubenswrapper[4753]: I1005 20:33:11.466056 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7qts4" event={"ID":"72e71171-91fe-4161-9899-93934608eaa2","Type":"ContainerStarted","Data":"7bd3f75ff782139c51c0348d117f57215ec660be1f07b009d61f048144dc9b0a"} Oct 05 20:33:11 crc kubenswrapper[4753]: I1005 20:33:11.466094 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7qts4" 
event={"ID":"72e71171-91fe-4161-9899-93934608eaa2","Type":"ContainerStarted","Data":"de2759b49b18bc5643bb2b8a5d7f4e597d814b85507fe04865ad19431e5ad114"} Oct 05 20:33:11 crc kubenswrapper[4753]: I1005 20:33:11.487415 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" podStartSLOduration=3.4873973449999998 podStartE2EDuration="3.487397345s" podCreationTimestamp="2025-10-05 20:33:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:33:11.481969175 +0000 UTC m=+1100.330297407" watchObservedRunningTime="2025-10-05 20:33:11.487397345 +0000 UTC m=+1100.335725577" Oct 05 20:33:11 crc kubenswrapper[4753]: I1005 20:33:11.511881 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7qts4" podStartSLOduration=2.511860809 podStartE2EDuration="2.511860809s" podCreationTimestamp="2025-10-05 20:33:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:33:11.5093017 +0000 UTC m=+1100.357629932" watchObservedRunningTime="2025-10-05 20:33:11.511860809 +0000 UTC m=+1100.360189042" Oct 05 20:33:12 crc kubenswrapper[4753]: I1005 20:33:12.217270 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:33:12 crc kubenswrapper[4753]: I1005 20:33:12.226753 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 05 20:33:15 crc kubenswrapper[4753]: I1005 20:33:15.526072 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a332dcd-7d9c-402c-9560-361e59390857","Type":"ContainerStarted","Data":"5a6141a781d1d8f165f88e914263568ae13298e5d59c73c43301c6bf27649d46"} Oct 05 20:33:15 crc kubenswrapper[4753]: I1005 20:33:15.527050 4753 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a332dcd-7d9c-402c-9560-361e59390857","Type":"ContainerStarted","Data":"e13d5b7f11e8bd358808a7cacb27e809be0ea56befb2ed72d0e65d4124f9626a"} Oct 05 20:33:15 crc kubenswrapper[4753]: I1005 20:33:15.529607 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4daa75de-d231-4306-bc81-a5b6a77df4ff","Type":"ContainerStarted","Data":"884259d45d596d1afa1c7ab8d82e648224698915c6bbab555e6afb2b8f9048e3"} Oct 05 20:33:15 crc kubenswrapper[4753]: I1005 20:33:15.536677 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a2d24ec3-181b-4651-ab1d-59f1975c052a","Type":"ContainerStarted","Data":"b87b29ecafec68b68c54b7c5e2d72250c93a4358248dd72cbd1a0369ae13f570"} Oct 05 20:33:15 crc kubenswrapper[4753]: I1005 20:33:15.536766 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="a2d24ec3-181b-4651-ab1d-59f1975c052a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b87b29ecafec68b68c54b7c5e2d72250c93a4358248dd72cbd1a0369ae13f570" gracePeriod=30 Oct 05 20:33:15 crc kubenswrapper[4753]: I1005 20:33:15.539014 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"be1530fd-875d-4fdb-aa54-e1faa5df1745","Type":"ContainerStarted","Data":"63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db"} Oct 05 20:33:15 crc kubenswrapper[4753]: I1005 20:33:15.539057 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"be1530fd-875d-4fdb-aa54-e1faa5df1745","Type":"ContainerStarted","Data":"a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10"} Oct 05 20:33:15 crc kubenswrapper[4753]: I1005 20:33:15.539166 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="be1530fd-875d-4fdb-aa54-e1faa5df1745" containerName="nova-metadata-log" containerID="cri-o://a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10" gracePeriod=30 Oct 05 20:33:15 crc kubenswrapper[4753]: I1005 20:33:15.539290 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="be1530fd-875d-4fdb-aa54-e1faa5df1745" containerName="nova-metadata-metadata" containerID="cri-o://63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db" gracePeriod=30 Oct 05 20:33:15 crc kubenswrapper[4753]: I1005 20:33:15.554032 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.547302428 podStartE2EDuration="7.554014728s" podCreationTimestamp="2025-10-05 20:33:08 +0000 UTC" firstStartedPulling="2025-10-05 20:33:09.476177965 +0000 UTC m=+1098.324506197" lastFinishedPulling="2025-10-05 20:33:14.482890265 +0000 UTC m=+1103.331218497" observedRunningTime="2025-10-05 20:33:15.544298194 +0000 UTC m=+1104.392626426" watchObservedRunningTime="2025-10-05 20:33:15.554014728 +0000 UTC m=+1104.402342980" Oct 05 20:33:15 crc kubenswrapper[4753]: I1005 20:33:15.569022 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.303064096 podStartE2EDuration="7.569004536s" podCreationTimestamp="2025-10-05 20:33:08 +0000 UTC" firstStartedPulling="2025-10-05 20:33:09.214554719 +0000 UTC m=+1098.062882941" lastFinishedPulling="2025-10-05 20:33:14.480495159 +0000 UTC m=+1103.328823381" observedRunningTime="2025-10-05 20:33:15.5668728 +0000 UTC m=+1104.415201052" watchObservedRunningTime="2025-10-05 20:33:15.569004536 +0000 UTC m=+1104.417332778" Oct 05 20:33:15 crc kubenswrapper[4753]: I1005 20:33:15.588541 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.453563377 
podStartE2EDuration="7.588526475s" podCreationTimestamp="2025-10-05 20:33:08 +0000 UTC" firstStartedPulling="2025-10-05 20:33:09.344311783 +0000 UTC m=+1098.192640015" lastFinishedPulling="2025-10-05 20:33:14.479274891 +0000 UTC m=+1103.327603113" observedRunningTime="2025-10-05 20:33:15.58195828 +0000 UTC m=+1104.430286512" watchObservedRunningTime="2025-10-05 20:33:15.588526475 +0000 UTC m=+1104.436854707" Oct 05 20:33:15 crc kubenswrapper[4753]: I1005 20:33:15.611630 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.673999906 podStartE2EDuration="7.611612647s" podCreationTimestamp="2025-10-05 20:33:08 +0000 UTC" firstStartedPulling="2025-10-05 20:33:09.544424517 +0000 UTC m=+1098.392752749" lastFinishedPulling="2025-10-05 20:33:14.482037258 +0000 UTC m=+1103.330365490" observedRunningTime="2025-10-05 20:33:15.603320688 +0000 UTC m=+1104.451648920" watchObservedRunningTime="2025-10-05 20:33:15.611612647 +0000 UTC m=+1104.459940879" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.522652 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.575623 4753 generic.go:334] "Generic (PLEG): container finished" podID="be1530fd-875d-4fdb-aa54-e1faa5df1745" containerID="63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db" exitCode=0 Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.575658 4753 generic.go:334] "Generic (PLEG): container finished" podID="be1530fd-875d-4fdb-aa54-e1faa5df1745" containerID="a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10" exitCode=143 Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.575747 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"be1530fd-875d-4fdb-aa54-e1faa5df1745","Type":"ContainerDied","Data":"63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db"} Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.575832 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"be1530fd-875d-4fdb-aa54-e1faa5df1745","Type":"ContainerDied","Data":"a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10"} Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.575849 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"be1530fd-875d-4fdb-aa54-e1faa5df1745","Type":"ContainerDied","Data":"ca7fb1aad33d0f4e5e00955095b040f1e2a29b213cd699338780c7f48383407e"} Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.575868 4753 scope.go:117] "RemoveContainer" containerID="63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.576161 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.598848 4753 scope.go:117] "RemoveContainer" containerID="a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.615113 4753 scope.go:117] "RemoveContainer" containerID="63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db" Oct 05 20:33:16 crc kubenswrapper[4753]: E1005 20:33:16.615429 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db\": container with ID starting with 63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db not found: ID does not exist" containerID="63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.615454 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db"} err="failed to get container status \"63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db\": rpc error: code = NotFound desc = could not find container \"63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db\": container with ID starting with 63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db not found: ID does not exist" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.615473 4753 scope.go:117] "RemoveContainer" containerID="a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10" Oct 05 20:33:16 crc kubenswrapper[4753]: E1005 20:33:16.615686 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10\": container with ID starting with 
a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10 not found: ID does not exist" containerID="a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.615707 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10"} err="failed to get container status \"a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10\": rpc error: code = NotFound desc = could not find container \"a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10\": container with ID starting with a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10 not found: ID does not exist" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.615719 4753 scope.go:117] "RemoveContainer" containerID="63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.615878 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db"} err="failed to get container status \"63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db\": rpc error: code = NotFound desc = could not find container \"63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db\": container with ID starting with 63b00a4c99372c7532a0dc381f6b0cdeacf487369bca28a4c24b61c6189389db not found: ID does not exist" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.615893 4753 scope.go:117] "RemoveContainer" containerID="a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.616362 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10"} err="failed to get container status 
\"a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10\": rpc error: code = NotFound desc = could not find container \"a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10\": container with ID starting with a0377e062a4e339a775da833d5ec5c9db9493cc6119589adfddccae39eddde10 not found: ID does not exist" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.635565 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1530fd-875d-4fdb-aa54-e1faa5df1745-config-data\") pod \"be1530fd-875d-4fdb-aa54-e1faa5df1745\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.635597 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be1530fd-875d-4fdb-aa54-e1faa5df1745-logs\") pod \"be1530fd-875d-4fdb-aa54-e1faa5df1745\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.635715 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1530fd-875d-4fdb-aa54-e1faa5df1745-combined-ca-bundle\") pod \"be1530fd-875d-4fdb-aa54-e1faa5df1745\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.635742 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlh5c\" (UniqueName: \"kubernetes.io/projected/be1530fd-875d-4fdb-aa54-e1faa5df1745-kube-api-access-vlh5c\") pod \"be1530fd-875d-4fdb-aa54-e1faa5df1745\" (UID: \"be1530fd-875d-4fdb-aa54-e1faa5df1745\") " Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.638429 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be1530fd-875d-4fdb-aa54-e1faa5df1745-logs" (OuterVolumeSpecName: "logs") pod 
"be1530fd-875d-4fdb-aa54-e1faa5df1745" (UID: "be1530fd-875d-4fdb-aa54-e1faa5df1745"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.668573 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be1530fd-875d-4fdb-aa54-e1faa5df1745-kube-api-access-vlh5c" (OuterVolumeSpecName: "kube-api-access-vlh5c") pod "be1530fd-875d-4fdb-aa54-e1faa5df1745" (UID: "be1530fd-875d-4fdb-aa54-e1faa5df1745"). InnerVolumeSpecName "kube-api-access-vlh5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.680762 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be1530fd-875d-4fdb-aa54-e1faa5df1745-config-data" (OuterVolumeSpecName: "config-data") pod "be1530fd-875d-4fdb-aa54-e1faa5df1745" (UID: "be1530fd-875d-4fdb-aa54-e1faa5df1745"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.693245 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be1530fd-875d-4fdb-aa54-e1faa5df1745-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be1530fd-875d-4fdb-aa54-e1faa5df1745" (UID: "be1530fd-875d-4fdb-aa54-e1faa5df1745"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.738638 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be1530fd-875d-4fdb-aa54-e1faa5df1745-logs\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.738682 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1530fd-875d-4fdb-aa54-e1faa5df1745-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.738696 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlh5c\" (UniqueName: \"kubernetes.io/projected/be1530fd-875d-4fdb-aa54-e1faa5df1745-kube-api-access-vlh5c\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.738708 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1530fd-875d-4fdb-aa54-e1faa5df1745-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.909264 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.916367 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.926363 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:33:16 crc kubenswrapper[4753]: E1005 20:33:16.926774 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1530fd-875d-4fdb-aa54-e1faa5df1745" containerName="nova-metadata-metadata" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.926796 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="be1530fd-875d-4fdb-aa54-e1faa5df1745" containerName="nova-metadata-metadata" Oct 05 20:33:16 crc 
kubenswrapper[4753]: E1005 20:33:16.927159 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1530fd-875d-4fdb-aa54-e1faa5df1745" containerName="nova-metadata-log" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.927174 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="be1530fd-875d-4fdb-aa54-e1faa5df1745" containerName="nova-metadata-log" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.927364 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="be1530fd-875d-4fdb-aa54-e1faa5df1745" containerName="nova-metadata-log" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.927389 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="be1530fd-875d-4fdb-aa54-e1faa5df1745" containerName="nova-metadata-metadata" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.928263 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.930769 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.938092 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 05 20:33:16 crc kubenswrapper[4753]: I1005 20:33:16.945743 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.044177 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-logs\") pod \"nova-metadata-0\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") " pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.044507 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") " pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.044531 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") " pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.044551 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-config-data\") pod \"nova-metadata-0\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") " pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.044586 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8c5c\" (UniqueName: \"kubernetes.io/projected/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-kube-api-access-v8c5c\") pod \"nova-metadata-0\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") " pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.145853 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-logs\") pod \"nova-metadata-0\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") " pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.145895 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") " pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.145915 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") " pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.145930 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-config-data\") pod \"nova-metadata-0\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") " pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.145963 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8c5c\" (UniqueName: \"kubernetes.io/projected/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-kube-api-access-v8c5c\") pod \"nova-metadata-0\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") " pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.146211 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-logs\") pod \"nova-metadata-0\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") " pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.151362 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") " pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.152603 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-config-data\") pod \"nova-metadata-0\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") " pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.153733 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") " pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.167730 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8c5c\" (UniqueName: \"kubernetes.io/projected/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-kube-api-access-v8c5c\") pod \"nova-metadata-0\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") " pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.253987 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.588859 4753 generic.go:334] "Generic (PLEG): container finished" podID="3472112b-9f63-410c-b285-c5a8cd2fa2fc" containerID="8b596b16decd9c2626d87ec85a3e00bbf094c8b9354c3ffb26d418d739010f29" exitCode=0 Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.588940 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kcbx2" event={"ID":"3472112b-9f63-410c-b285-c5a8cd2fa2fc","Type":"ContainerDied","Data":"8b596b16decd9c2626d87ec85a3e00bbf094c8b9354c3ffb26d418d739010f29"} Oct 05 20:33:17 crc kubenswrapper[4753]: W1005 20:33:17.697048 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb27db7b7_34b0_4d4d_bdc0_33e3ece8078c.slice/crio-4c1f7c1a67e4cce4d94ef78cb3750a53f1422a961ad8eced4433e2315da9d485 WatchSource:0}: Error finding container 4c1f7c1a67e4cce4d94ef78cb3750a53f1422a961ad8eced4433e2315da9d485: Status 404 returned error can't find the container with id 4c1f7c1a67e4cce4d94ef78cb3750a53f1422a961ad8eced4433e2315da9d485 Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.699611 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:33:17 crc kubenswrapper[4753]: I1005 20:33:17.875774 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be1530fd-875d-4fdb-aa54-e1faa5df1745" path="/var/lib/kubelet/pods/be1530fd-875d-4fdb-aa54-e1faa5df1745/volumes" Oct 05 20:33:18 crc kubenswrapper[4753]: I1005 20:33:18.524529 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:18 crc kubenswrapper[4753]: I1005 20:33:18.603251 4753 generic.go:334] "Generic (PLEG): container finished" podID="72e71171-91fe-4161-9899-93934608eaa2" containerID="7bd3f75ff782139c51c0348d117f57215ec660be1f07b009d61f048144dc9b0a" exitCode=0 Oct 
05 20:33:18 crc kubenswrapper[4753]: I1005 20:33:18.603306 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7qts4" event={"ID":"72e71171-91fe-4161-9899-93934608eaa2","Type":"ContainerDied","Data":"7bd3f75ff782139c51c0348d117f57215ec660be1f07b009d61f048144dc9b0a"} Oct 05 20:33:18 crc kubenswrapper[4753]: I1005 20:33:18.608250 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c","Type":"ContainerStarted","Data":"18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3"} Oct 05 20:33:18 crc kubenswrapper[4753]: I1005 20:33:18.608310 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c","Type":"ContainerStarted","Data":"977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5"} Oct 05 20:33:18 crc kubenswrapper[4753]: I1005 20:33:18.608336 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c","Type":"ContainerStarted","Data":"4c1f7c1a67e4cce4d94ef78cb3750a53f1422a961ad8eced4433e2315da9d485"} Oct 05 20:33:18 crc kubenswrapper[4753]: I1005 20:33:18.663621 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.663600692 podStartE2EDuration="2.663600692s" podCreationTimestamp="2025-10-05 20:33:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:33:18.649866213 +0000 UTC m=+1107.498194465" watchObservedRunningTime="2025-10-05 20:33:18.663600692 +0000 UTC m=+1107.511928944" Oct 05 20:33:18 crc kubenswrapper[4753]: I1005 20:33:18.736933 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 05 20:33:18 crc kubenswrapper[4753]: I1005 
20:33:18.737317 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 05 20:33:18 crc kubenswrapper[4753]: I1005 20:33:18.785801 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 05 20:33:18 crc kubenswrapper[4753]: I1005 20:33:18.785841 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 05 20:33:18 crc kubenswrapper[4753]: I1005 20:33:18.793175 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 05 20:33:18 crc kubenswrapper[4753]: I1005 20:33:18.898311 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:18 crc kubenswrapper[4753]: I1005 20:33:18.953867 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bb644bd75-f6958"] Oct 05 20:33:18 crc kubenswrapper[4753]: I1005 20:33:18.954077 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bb644bd75-f6958" podUID="2de204c7-6e5d-4369-abaf-139ec0d2edcb" containerName="dnsmasq-dns" containerID="cri-o://0ce7dac14ce3c0fb4bc4337c1689abe41e7ddea7d1244b11d052605e935a8ee6" gracePeriod=10 Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.072755 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kcbx2" Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.182947 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-config-data\") pod \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.183180 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-scripts\") pod \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.183302 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42pjh\" (UniqueName: \"kubernetes.io/projected/3472112b-9f63-410c-b285-c5a8cd2fa2fc-kube-api-access-42pjh\") pod \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.183358 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-combined-ca-bundle\") pod \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\" (UID: \"3472112b-9f63-410c-b285-c5a8cd2fa2fc\") " Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.188242 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3472112b-9f63-410c-b285-c5a8cd2fa2fc-kube-api-access-42pjh" (OuterVolumeSpecName: "kube-api-access-42pjh") pod "3472112b-9f63-410c-b285-c5a8cd2fa2fc" (UID: "3472112b-9f63-410c-b285-c5a8cd2fa2fc"). InnerVolumeSpecName "kube-api-access-42pjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.188525 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-scripts" (OuterVolumeSpecName: "scripts") pod "3472112b-9f63-410c-b285-c5a8cd2fa2fc" (UID: "3472112b-9f63-410c-b285-c5a8cd2fa2fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.218952 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3472112b-9f63-410c-b285-c5a8cd2fa2fc" (UID: "3472112b-9f63-410c-b285-c5a8cd2fa2fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.221382 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-config-data" (OuterVolumeSpecName: "config-data") pod "3472112b-9f63-410c-b285-c5a8cd2fa2fc" (UID: "3472112b-9f63-410c-b285-c5a8cd2fa2fc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.285539 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.285573 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42pjh\" (UniqueName: \"kubernetes.io/projected/3472112b-9f63-410c-b285-c5a8cd2fa2fc-kube-api-access-42pjh\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.285609 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.285619 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3472112b-9f63-410c-b285-c5a8cd2fa2fc-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.408702 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bb644bd75-f6958"
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.588863 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-config\") pod \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") "
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.588957 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-ovsdbserver-sb\") pod \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") "
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.589080 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg79m\" (UniqueName: \"kubernetes.io/projected/2de204c7-6e5d-4369-abaf-139ec0d2edcb-kube-api-access-mg79m\") pod \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") "
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.589150 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-dns-svc\") pod \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") "
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.589170 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-ovsdbserver-nb\") pod \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\" (UID: \"2de204c7-6e5d-4369-abaf-139ec0d2edcb\") "
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.595225 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de204c7-6e5d-4369-abaf-139ec0d2edcb-kube-api-access-mg79m" (OuterVolumeSpecName: "kube-api-access-mg79m") pod "2de204c7-6e5d-4369-abaf-139ec0d2edcb" (UID: "2de204c7-6e5d-4369-abaf-139ec0d2edcb"). InnerVolumeSpecName "kube-api-access-mg79m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.624441 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kcbx2" event={"ID":"3472112b-9f63-410c-b285-c5a8cd2fa2fc","Type":"ContainerDied","Data":"f280e8601c14118a24590f9728fd18f82bfe2629b9a621503ee6387a32f0465f"}
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.624734 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f280e8601c14118a24590f9728fd18f82bfe2629b9a621503ee6387a32f0465f"
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.624819 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kcbx2"
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.634543 4753 generic.go:334] "Generic (PLEG): container finished" podID="2de204c7-6e5d-4369-abaf-139ec0d2edcb" containerID="0ce7dac14ce3c0fb4bc4337c1689abe41e7ddea7d1244b11d052605e935a8ee6" exitCode=0
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.634766 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bb644bd75-f6958"
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.635423 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb644bd75-f6958" event={"ID":"2de204c7-6e5d-4369-abaf-139ec0d2edcb","Type":"ContainerDied","Data":"0ce7dac14ce3c0fb4bc4337c1689abe41e7ddea7d1244b11d052605e935a8ee6"}
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.635456 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb644bd75-f6958" event={"ID":"2de204c7-6e5d-4369-abaf-139ec0d2edcb","Type":"ContainerDied","Data":"b003bf45b4713f12c43fd3abd531570474ef3b288d9ffeb4e8a3e924684029df"}
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.635473 4753 scope.go:117] "RemoveContainer" containerID="0ce7dac14ce3c0fb4bc4337c1689abe41e7ddea7d1244b11d052605e935a8ee6"
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.668070 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2de204c7-6e5d-4369-abaf-139ec0d2edcb" (UID: "2de204c7-6e5d-4369-abaf-139ec0d2edcb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.672766 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.677865 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2de204c7-6e5d-4369-abaf-139ec0d2edcb" (UID: "2de204c7-6e5d-4369-abaf-139ec0d2edcb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.693681 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg79m\" (UniqueName: \"kubernetes.io/projected/2de204c7-6e5d-4369-abaf-139ec0d2edcb-kube-api-access-mg79m\") on node \"crc\" DevicePath \"\""
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.695645 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.695857 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.709088 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-config" (OuterVolumeSpecName: "config") pod "2de204c7-6e5d-4369-abaf-139ec0d2edcb" (UID: "2de204c7-6e5d-4369-abaf-139ec0d2edcb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.728582 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2de204c7-6e5d-4369-abaf-139ec0d2edcb" (UID: "2de204c7-6e5d-4369-abaf-139ec0d2edcb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.782674 4753 scope.go:117] "RemoveContainer" containerID="c0075795fadfaeabc620ae48fbec5fe8534a9f75d7f97f3d13bbb9585d9a2531"
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.797446 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.797473 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2de204c7-6e5d-4369-abaf-139ec0d2edcb-config\") on node \"crc\" DevicePath \"\""
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.807481 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.807685 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8a332dcd-7d9c-402c-9560-361e59390857" containerName="nova-api-log" containerID="cri-o://e13d5b7f11e8bd358808a7cacb27e809be0ea56befb2ed72d0e65d4124f9626a" gracePeriod=30
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.807947 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8a332dcd-7d9c-402c-9560-361e59390857" containerName="nova-api-api" containerID="cri-o://5a6141a781d1d8f165f88e914263568ae13298e5d59c73c43301c6bf27649d46" gracePeriod=30
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.814303 4753 scope.go:117] "RemoveContainer" containerID="0ce7dac14ce3c0fb4bc4337c1689abe41e7ddea7d1244b11d052605e935a8ee6"
Oct 05 20:33:19 crc kubenswrapper[4753]: E1005 20:33:19.817273 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce7dac14ce3c0fb4bc4337c1689abe41e7ddea7d1244b11d052605e935a8ee6\": container with ID starting with 0ce7dac14ce3c0fb4bc4337c1689abe41e7ddea7d1244b11d052605e935a8ee6 not found: ID does not exist" containerID="0ce7dac14ce3c0fb4bc4337c1689abe41e7ddea7d1244b11d052605e935a8ee6"
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.817305 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce7dac14ce3c0fb4bc4337c1689abe41e7ddea7d1244b11d052605e935a8ee6"} err="failed to get container status \"0ce7dac14ce3c0fb4bc4337c1689abe41e7ddea7d1244b11d052605e935a8ee6\": rpc error: code = NotFound desc = could not find container \"0ce7dac14ce3c0fb4bc4337c1689abe41e7ddea7d1244b11d052605e935a8ee6\": container with ID starting with 0ce7dac14ce3c0fb4bc4337c1689abe41e7ddea7d1244b11d052605e935a8ee6 not found: ID does not exist"
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.817326 4753 scope.go:117] "RemoveContainer" containerID="c0075795fadfaeabc620ae48fbec5fe8534a9f75d7f97f3d13bbb9585d9a2531"
Oct 05 20:33:19 crc kubenswrapper[4753]: E1005 20:33:19.820981 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0075795fadfaeabc620ae48fbec5fe8534a9f75d7f97f3d13bbb9585d9a2531\": container with ID starting with c0075795fadfaeabc620ae48fbec5fe8534a9f75d7f97f3d13bbb9585d9a2531 not found: ID does not exist" containerID="c0075795fadfaeabc620ae48fbec5fe8534a9f75d7f97f3d13bbb9585d9a2531"
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.821004 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0075795fadfaeabc620ae48fbec5fe8534a9f75d7f97f3d13bbb9585d9a2531"} err="failed to get container status \"c0075795fadfaeabc620ae48fbec5fe8534a9f75d7f97f3d13bbb9585d9a2531\": rpc error: code = NotFound desc = could not find container \"c0075795fadfaeabc620ae48fbec5fe8534a9f75d7f97f3d13bbb9585d9a2531\": container with ID starting with c0075795fadfaeabc620ae48fbec5fe8534a9f75d7f97f3d13bbb9585d9a2531 not found: ID does not exist"
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.829464 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8a332dcd-7d9c-402c-9560-361e59390857" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.172:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.829553 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8a332dcd-7d9c-402c-9560-361e59390857" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.172:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 05 20:33:19 crc kubenswrapper[4753]: I1005 20:33:19.862681 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.001057 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bb644bd75-f6958"]
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.013553 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bb644bd75-f6958"]
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.091122 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7qts4"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.211924 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-scripts\") pod \"72e71171-91fe-4161-9899-93934608eaa2\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") "
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.212367 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-config-data\") pod \"72e71171-91fe-4161-9899-93934608eaa2\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") "
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.212407 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-combined-ca-bundle\") pod \"72e71171-91fe-4161-9899-93934608eaa2\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") "
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.212475 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn5dw\" (UniqueName: \"kubernetes.io/projected/72e71171-91fe-4161-9899-93934608eaa2-kube-api-access-xn5dw\") pod \"72e71171-91fe-4161-9899-93934608eaa2\" (UID: \"72e71171-91fe-4161-9899-93934608eaa2\") "
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.218505 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-scripts" (OuterVolumeSpecName: "scripts") pod "72e71171-91fe-4161-9899-93934608eaa2" (UID: "72e71171-91fe-4161-9899-93934608eaa2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.218771 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72e71171-91fe-4161-9899-93934608eaa2-kube-api-access-xn5dw" (OuterVolumeSpecName: "kube-api-access-xn5dw") pod "72e71171-91fe-4161-9899-93934608eaa2" (UID: "72e71171-91fe-4161-9899-93934608eaa2"). InnerVolumeSpecName "kube-api-access-xn5dw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.246327 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-config-data" (OuterVolumeSpecName: "config-data") pod "72e71171-91fe-4161-9899-93934608eaa2" (UID: "72e71171-91fe-4161-9899-93934608eaa2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.248901 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72e71171-91fe-4161-9899-93934608eaa2" (UID: "72e71171-91fe-4161-9899-93934608eaa2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.314761 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-scripts\") on node \"crc\" DevicePath \"\""
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.315008 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-config-data\") on node \"crc\" DevicePath \"\""
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.315076 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72e71171-91fe-4161-9899-93934608eaa2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.315156 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn5dw\" (UniqueName: \"kubernetes.io/projected/72e71171-91fe-4161-9899-93934608eaa2-kube-api-access-xn5dw\") on node \"crc\" DevicePath \"\""
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.323307 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.644204 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7qts4" event={"ID":"72e71171-91fe-4161-9899-93934608eaa2","Type":"ContainerDied","Data":"de2759b49b18bc5643bb2b8a5d7f4e597d814b85507fe04865ad19431e5ad114"}
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.644252 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de2759b49b18bc5643bb2b8a5d7f4e597d814b85507fe04865ad19431e5ad114"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.644332 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7qts4"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.690069 4753 generic.go:334] "Generic (PLEG): container finished" podID="8a332dcd-7d9c-402c-9560-361e59390857" containerID="e13d5b7f11e8bd358808a7cacb27e809be0ea56befb2ed72d0e65d4124f9626a" exitCode=143
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.690466 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a332dcd-7d9c-402c-9560-361e59390857","Type":"ContainerDied","Data":"e13d5b7f11e8bd358808a7cacb27e809be0ea56befb2ed72d0e65d4124f9626a"}
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.690848 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" containerName="nova-metadata-log" containerID="cri-o://977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5" gracePeriod=30
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.691376 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" containerName="nova-metadata-metadata" containerID="cri-o://18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3" gracePeriod=30
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.724968 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 05 20:33:20 crc kubenswrapper[4753]: E1005 20:33:20.725359 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3472112b-9f63-410c-b285-c5a8cd2fa2fc" containerName="nova-manage"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.725376 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3472112b-9f63-410c-b285-c5a8cd2fa2fc" containerName="nova-manage"
Oct 05 20:33:20 crc kubenswrapper[4753]: E1005 20:33:20.725388 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de204c7-6e5d-4369-abaf-139ec0d2edcb" containerName="init"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.725395 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de204c7-6e5d-4369-abaf-139ec0d2edcb" containerName="init"
Oct 05 20:33:20 crc kubenswrapper[4753]: E1005 20:33:20.725410 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72e71171-91fe-4161-9899-93934608eaa2" containerName="nova-cell1-conductor-db-sync"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.725416 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="72e71171-91fe-4161-9899-93934608eaa2" containerName="nova-cell1-conductor-db-sync"
Oct 05 20:33:20 crc kubenswrapper[4753]: E1005 20:33:20.725433 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de204c7-6e5d-4369-abaf-139ec0d2edcb" containerName="dnsmasq-dns"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.725438 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de204c7-6e5d-4369-abaf-139ec0d2edcb" containerName="dnsmasq-dns"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.725605 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="72e71171-91fe-4161-9899-93934608eaa2" containerName="nova-cell1-conductor-db-sync"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.725618 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de204c7-6e5d-4369-abaf-139ec0d2edcb" containerName="dnsmasq-dns"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.725627 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3472112b-9f63-410c-b285-c5a8cd2fa2fc" containerName="nova-manage"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.727969 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.731029 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.743921 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.825903 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gkbs\" (UniqueName: \"kubernetes.io/projected/ec681529-93c2-4792-8e1e-ccbc696ed9ee-kube-api-access-2gkbs\") pod \"nova-cell1-conductor-0\" (UID: \"ec681529-93c2-4792-8e1e-ccbc696ed9ee\") " pod="openstack/nova-cell1-conductor-0"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.825958 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec681529-93c2-4792-8e1e-ccbc696ed9ee-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ec681529-93c2-4792-8e1e-ccbc696ed9ee\") " pod="openstack/nova-cell1-conductor-0"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.825987 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec681529-93c2-4792-8e1e-ccbc696ed9ee-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ec681529-93c2-4792-8e1e-ccbc696ed9ee\") " pod="openstack/nova-cell1-conductor-0"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.927390 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gkbs\" (UniqueName: \"kubernetes.io/projected/ec681529-93c2-4792-8e1e-ccbc696ed9ee-kube-api-access-2gkbs\") pod \"nova-cell1-conductor-0\" (UID: \"ec681529-93c2-4792-8e1e-ccbc696ed9ee\") " pod="openstack/nova-cell1-conductor-0"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.927436 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec681529-93c2-4792-8e1e-ccbc696ed9ee-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ec681529-93c2-4792-8e1e-ccbc696ed9ee\") " pod="openstack/nova-cell1-conductor-0"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.927460 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec681529-93c2-4792-8e1e-ccbc696ed9ee-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ec681529-93c2-4792-8e1e-ccbc696ed9ee\") " pod="openstack/nova-cell1-conductor-0"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.931958 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec681529-93c2-4792-8e1e-ccbc696ed9ee-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ec681529-93c2-4792-8e1e-ccbc696ed9ee\") " pod="openstack/nova-cell1-conductor-0"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.933664 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec681529-93c2-4792-8e1e-ccbc696ed9ee-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ec681529-93c2-4792-8e1e-ccbc696ed9ee\") " pod="openstack/nova-cell1-conductor-0"
Oct 05 20:33:20 crc kubenswrapper[4753]: I1005 20:33:20.953015 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gkbs\" (UniqueName: \"kubernetes.io/projected/ec681529-93c2-4792-8e1e-ccbc696ed9ee-kube-api-access-2gkbs\") pod \"nova-cell1-conductor-0\" (UID: \"ec681529-93c2-4792-8e1e-ccbc696ed9ee\") " pod="openstack/nova-cell1-conductor-0"
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.053279 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.357244 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.433831 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.436441 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-config-data\") pod \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") "
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.436840 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-logs\") pod \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") "
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.436919 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-nova-metadata-tls-certs\") pod \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") "
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.436985 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8c5c\" (UniqueName: \"kubernetes.io/projected/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-kube-api-access-v8c5c\") pod \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") "
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.437022 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-combined-ca-bundle\") pod \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\" (UID: \"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c\") "
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.438362 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-logs" (OuterVolumeSpecName: "logs") pod "b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" (UID: "b27db7b7-34b0-4d4d-bdc0-33e3ece8078c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.441023 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-kube-api-access-v8c5c" (OuterVolumeSpecName: "kube-api-access-v8c5c") pod "b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" (UID: "b27db7b7-34b0-4d4d-bdc0-33e3ece8078c"). InnerVolumeSpecName "kube-api-access-v8c5c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.467354 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" (UID: "b27db7b7-34b0-4d4d-bdc0-33e3ece8078c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.468015 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-config-data" (OuterVolumeSpecName: "config-data") pod "b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" (UID: "b27db7b7-34b0-4d4d-bdc0-33e3ece8078c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.515319 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" (UID: "b27db7b7-34b0-4d4d-bdc0-33e3ece8078c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.540467 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-logs\") on node \"crc\" DevicePath \"\""
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.540535 4753 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.540574 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8c5c\" (UniqueName: \"kubernetes.io/projected/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-kube-api-access-v8c5c\") on node \"crc\" DevicePath \"\""
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.540607 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.540617 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c-config-data\") on node \"crc\" DevicePath \"\""
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.699557 4753 generic.go:334] "Generic (PLEG): container finished" podID="b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" containerID="18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3" exitCode=0
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.699835 4753 generic.go:334] "Generic (PLEG): container finished" podID="b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" containerID="977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5" exitCode=143
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.699650 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.699651 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c","Type":"ContainerDied","Data":"18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3"}
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.699899 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c","Type":"ContainerDied","Data":"977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5"}
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.699913 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b27db7b7-34b0-4d4d-bdc0-33e3ece8078c","Type":"ContainerDied","Data":"4c1f7c1a67e4cce4d94ef78cb3750a53f1422a961ad8eced4433e2315da9d485"}
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.699928 4753 scope.go:117] "RemoveContainer" containerID="18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3"
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.702349 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ec681529-93c2-4792-8e1e-ccbc696ed9ee","Type":"ContainerStarted","Data":"ae45fbd714d4aadef3abae857088d1a6c46dfe31a19982aeb5fcb6c59172305d"}
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.702396 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ec681529-93c2-4792-8e1e-ccbc696ed9ee","Type":"ContainerStarted","Data":"16c08692861e2fd7c7c71621f5c9e97e711f0dd038af92a256d5ab5e07a09c37"}
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.702443 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4daa75de-d231-4306-bc81-a5b6a77df4ff" containerName="nova-scheduler-scheduler" containerID="cri-o://884259d45d596d1afa1c7ab8d82e648224698915c6bbab555e6afb2b8f9048e3" gracePeriod=30
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.720969 4753 scope.go:117] "RemoveContainer" containerID="977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5"
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.732965 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.732946249 podStartE2EDuration="1.732946249s" podCreationTimestamp="2025-10-05 20:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:33:21.729532642 +0000 UTC m=+1110.577860894" watchObservedRunningTime="2025-10-05 20:33:21.732946249 +0000 UTC m=+1110.581274481"
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.744312 4753 scope.go:117] "RemoveContainer" containerID="18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3"
Oct 05 20:33:21 crc kubenswrapper[4753]: E1005 20:33:21.750383 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3\": container with ID starting with 18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3 not found: ID does not exist" containerID="18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3"
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.750427 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3"} err="failed to get container status \"18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3\": rpc error: code = NotFound desc = could not find container \"18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3\": container with ID starting with 18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3 not found: ID does not exist"
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.750454 4753 scope.go:117] "RemoveContainer" containerID="977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5"
Oct 05 20:33:21 crc kubenswrapper[4753]: E1005 20:33:21.752998 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5\": container with ID starting with 977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5 not found: ID does not exist" containerID="977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5"
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.753028 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5"} err="failed to get container status \"977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5\": rpc error: code = NotFound desc = could not find container \"977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5\": container with ID starting with 977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5 not found: ID does not exist"
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.753045 4753 scope.go:117] "RemoveContainer" containerID="18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3"
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.753403 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3"} err="failed to get container status \"18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3\": rpc error: code = NotFound desc = could not find container \"18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3\": container with ID starting with 18b54b92a54b3123693919d7402133ef18f80ab7c3e0273a64295aca1f7b45c3 not found: ID does not exist"
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.753427 4753 scope.go:117] "RemoveContainer" containerID="977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5"
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.753605 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5"} err="failed to get container status \"977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5\": rpc error: code = NotFound desc = could not find container \"977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5\": container with ID starting with 977695f056bcd79c221c146eff0dcadab61b8e40b0b64a4ca5995724e2c19bc5 not found: ID does not exist"
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.760403 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.772184 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.779829 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 05 20:33:21 crc kubenswrapper[4753]: E1005 20:33:21.780234 4753 cpu_manager.go:410]
"RemoveStaleState: removing container" podUID="b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" containerName="nova-metadata-log" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.780252 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" containerName="nova-metadata-log" Oct 05 20:33:21 crc kubenswrapper[4753]: E1005 20:33:21.780270 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" containerName="nova-metadata-metadata" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.780276 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" containerName="nova-metadata-metadata" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.780463 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" containerName="nova-metadata-log" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.780487 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" containerName="nova-metadata-metadata" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.781386 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.788601 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.788681 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.799378 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.844615 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfrwf\" (UniqueName: \"kubernetes.io/projected/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-kube-api-access-lfrwf\") pod \"nova-metadata-0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " pod="openstack/nova-metadata-0" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.844677 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-logs\") pod \"nova-metadata-0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " pod="openstack/nova-metadata-0" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.844693 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " pod="openstack/nova-metadata-0" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.844840 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-config-data\") pod \"nova-metadata-0\" 
(UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " pod="openstack/nova-metadata-0" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.844977 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " pod="openstack/nova-metadata-0" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.865372 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de204c7-6e5d-4369-abaf-139ec0d2edcb" path="/var/lib/kubelet/pods/2de204c7-6e5d-4369-abaf-139ec0d2edcb/volumes" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.865970 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b27db7b7-34b0-4d4d-bdc0-33e3ece8078c" path="/var/lib/kubelet/pods/b27db7b7-34b0-4d4d-bdc0-33e3ece8078c/volumes" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.947086 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " pod="openstack/nova-metadata-0" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.948235 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfrwf\" (UniqueName: \"kubernetes.io/projected/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-kube-api-access-lfrwf\") pod \"nova-metadata-0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " pod="openstack/nova-metadata-0" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.948334 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-logs\") pod \"nova-metadata-0\" (UID: 
\"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " pod="openstack/nova-metadata-0" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.948351 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " pod="openstack/nova-metadata-0" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.948385 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-config-data\") pod \"nova-metadata-0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " pod="openstack/nova-metadata-0" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.948878 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-logs\") pod \"nova-metadata-0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " pod="openstack/nova-metadata-0" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.951989 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-config-data\") pod \"nova-metadata-0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " pod="openstack/nova-metadata-0" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.952593 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " pod="openstack/nova-metadata-0" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.955291 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " pod="openstack/nova-metadata-0" Oct 05 20:33:21 crc kubenswrapper[4753]: I1005 20:33:21.969646 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfrwf\" (UniqueName: \"kubernetes.io/projected/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-kube-api-access-lfrwf\") pod \"nova-metadata-0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " pod="openstack/nova-metadata-0" Oct 05 20:33:22 crc kubenswrapper[4753]: I1005 20:33:22.100238 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 05 20:33:22 crc kubenswrapper[4753]: I1005 20:33:22.545905 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:33:22 crc kubenswrapper[4753]: W1005 20:33:22.551211 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8afbf63_cd8b_483a_a533_ccddc2c3ebc0.slice/crio-61675852914e162a8bbcd198ca0726078d8c460e10beb8c21a0239fb5dbf7393 WatchSource:0}: Error finding container 61675852914e162a8bbcd198ca0726078d8c460e10beb8c21a0239fb5dbf7393: Status 404 returned error can't find the container with id 61675852914e162a8bbcd198ca0726078d8c460e10beb8c21a0239fb5dbf7393 Oct 05 20:33:22 crc kubenswrapper[4753]: I1005 20:33:22.729213 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0","Type":"ContainerStarted","Data":"ec4418cbdb10776cb511fea8368349ae7e960bec17a95b7371c061ab2a4d4eee"} Oct 05 20:33:22 crc kubenswrapper[4753]: I1005 20:33:22.729552 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0","Type":"ContainerStarted","Data":"61675852914e162a8bbcd198ca0726078d8c460e10beb8c21a0239fb5dbf7393"} Oct 05 20:33:22 crc kubenswrapper[4753]: I1005 20:33:22.735981 4753 generic.go:334] "Generic (PLEG): container finished" podID="4daa75de-d231-4306-bc81-a5b6a77df4ff" containerID="884259d45d596d1afa1c7ab8d82e648224698915c6bbab555e6afb2b8f9048e3" exitCode=0 Oct 05 20:33:22 crc kubenswrapper[4753]: I1005 20:33:22.737116 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4daa75de-d231-4306-bc81-a5b6a77df4ff","Type":"ContainerDied","Data":"884259d45d596d1afa1c7ab8d82e648224698915c6bbab555e6afb2b8f9048e3"} Oct 05 20:33:22 crc kubenswrapper[4753]: I1005 20:33:22.737175 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 05 20:33:22 crc kubenswrapper[4753]: I1005 20:33:22.863592 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 05 20:33:22 crc kubenswrapper[4753]: I1005 20:33:22.961890 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4daa75de-d231-4306-bc81-a5b6a77df4ff-combined-ca-bundle\") pod \"4daa75de-d231-4306-bc81-a5b6a77df4ff\" (UID: \"4daa75de-d231-4306-bc81-a5b6a77df4ff\") " Oct 05 20:33:22 crc kubenswrapper[4753]: I1005 20:33:22.962108 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjhkd\" (UniqueName: \"kubernetes.io/projected/4daa75de-d231-4306-bc81-a5b6a77df4ff-kube-api-access-vjhkd\") pod \"4daa75de-d231-4306-bc81-a5b6a77df4ff\" (UID: \"4daa75de-d231-4306-bc81-a5b6a77df4ff\") " Oct 05 20:33:22 crc kubenswrapper[4753]: I1005 20:33:22.962210 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4daa75de-d231-4306-bc81-a5b6a77df4ff-config-data\") pod \"4daa75de-d231-4306-bc81-a5b6a77df4ff\" (UID: \"4daa75de-d231-4306-bc81-a5b6a77df4ff\") " Oct 05 20:33:22 crc kubenswrapper[4753]: I1005 20:33:22.982122 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4daa75de-d231-4306-bc81-a5b6a77df4ff-kube-api-access-vjhkd" (OuterVolumeSpecName: "kube-api-access-vjhkd") pod "4daa75de-d231-4306-bc81-a5b6a77df4ff" (UID: "4daa75de-d231-4306-bc81-a5b6a77df4ff"). InnerVolumeSpecName "kube-api-access-vjhkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:33:22 crc kubenswrapper[4753]: I1005 20:33:22.989979 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4daa75de-d231-4306-bc81-a5b6a77df4ff-config-data" (OuterVolumeSpecName: "config-data") pod "4daa75de-d231-4306-bc81-a5b6a77df4ff" (UID: "4daa75de-d231-4306-bc81-a5b6a77df4ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:22 crc kubenswrapper[4753]: I1005 20:33:22.991956 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4daa75de-d231-4306-bc81-a5b6a77df4ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4daa75de-d231-4306-bc81-a5b6a77df4ff" (UID: "4daa75de-d231-4306-bc81-a5b6a77df4ff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.064666 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjhkd\" (UniqueName: \"kubernetes.io/projected/4daa75de-d231-4306-bc81-a5b6a77df4ff-kube-api-access-vjhkd\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.064700 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4daa75de-d231-4306-bc81-a5b6a77df4ff-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.064714 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4daa75de-d231-4306-bc81-a5b6a77df4ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.745625 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0","Type":"ContainerStarted","Data":"0b32bba98d152b9756a5785b8f17a9dd6b7f69f421f0f42cdf885600ed3b23b7"} Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.747402 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.747439 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4daa75de-d231-4306-bc81-a5b6a77df4ff","Type":"ContainerDied","Data":"e74cf2d59c74e2261a39377b3cf5acac24ed7ecb822588ab08b888455d31091b"} Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.747481 4753 scope.go:117] "RemoveContainer" containerID="884259d45d596d1afa1c7ab8d82e648224698915c6bbab555e6afb2b8f9048e3" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.765962 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.76594343 podStartE2EDuration="2.76594343s" podCreationTimestamp="2025-10-05 20:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:33:23.763617567 +0000 UTC m=+1112.611945819" watchObservedRunningTime="2025-10-05 20:33:23.76594343 +0000 UTC m=+1112.614271672" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.786410 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.806717 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.822016 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 05 20:33:23 crc kubenswrapper[4753]: E1005 20:33:23.822494 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4daa75de-d231-4306-bc81-a5b6a77df4ff" containerName="nova-scheduler-scheduler" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.822518 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4daa75de-d231-4306-bc81-a5b6a77df4ff" containerName="nova-scheduler-scheduler" Oct 05 20:33:23 crc kubenswrapper[4753]: 
I1005 20:33:23.822719 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4daa75de-d231-4306-bc81-a5b6a77df4ff" containerName="nova-scheduler-scheduler" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.823427 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.827509 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.827843 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.875737 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4daa75de-d231-4306-bc81-a5b6a77df4ff" path="/var/lib/kubelet/pods/4daa75de-d231-4306-bc81-a5b6a77df4ff/volumes" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.876350 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87d65699-7e60-444b-aefb-d5d80bf24404-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"87d65699-7e60-444b-aefb-d5d80bf24404\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.876431 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87d65699-7e60-444b-aefb-d5d80bf24404-config-data\") pod \"nova-scheduler-0\" (UID: \"87d65699-7e60-444b-aefb-d5d80bf24404\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.876496 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62zcb\" (UniqueName: \"kubernetes.io/projected/87d65699-7e60-444b-aefb-d5d80bf24404-kube-api-access-62zcb\") pod \"nova-scheduler-0\" 
(UID: \"87d65699-7e60-444b-aefb-d5d80bf24404\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.978430 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62zcb\" (UniqueName: \"kubernetes.io/projected/87d65699-7e60-444b-aefb-d5d80bf24404-kube-api-access-62zcb\") pod \"nova-scheduler-0\" (UID: \"87d65699-7e60-444b-aefb-d5d80bf24404\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.978557 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87d65699-7e60-444b-aefb-d5d80bf24404-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"87d65699-7e60-444b-aefb-d5d80bf24404\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.978638 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87d65699-7e60-444b-aefb-d5d80bf24404-config-data\") pod \"nova-scheduler-0\" (UID: \"87d65699-7e60-444b-aefb-d5d80bf24404\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.983451 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87d65699-7e60-444b-aefb-d5d80bf24404-config-data\") pod \"nova-scheduler-0\" (UID: \"87d65699-7e60-444b-aefb-d5d80bf24404\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.993706 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62zcb\" (UniqueName: \"kubernetes.io/projected/87d65699-7e60-444b-aefb-d5d80bf24404-kube-api-access-62zcb\") pod \"nova-scheduler-0\" (UID: \"87d65699-7e60-444b-aefb-d5d80bf24404\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:23 crc kubenswrapper[4753]: I1005 20:33:23.994377 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87d65699-7e60-444b-aefb-d5d80bf24404-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"87d65699-7e60-444b-aefb-d5d80bf24404\") " pod="openstack/nova-scheduler-0" Oct 05 20:33:24 crc kubenswrapper[4753]: I1005 20:33:24.176893 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 05 20:33:24 crc kubenswrapper[4753]: I1005 20:33:24.651913 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 05 20:33:24 crc kubenswrapper[4753]: I1005 20:33:24.759626 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87d65699-7e60-444b-aefb-d5d80bf24404","Type":"ContainerStarted","Data":"42fdcedd0e1d16be753237d8a832586f56601edd3e89b647a08b1334d907b401"} Oct 05 20:33:25 crc kubenswrapper[4753]: I1005 20:33:25.775621 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87d65699-7e60-444b-aefb-d5d80bf24404","Type":"ContainerStarted","Data":"3d2bc67b44b4e12fa9d832b6c88ff29a0d016ca2468f8307fbf8895b770a6ec5"} Oct 05 20:33:25 crc kubenswrapper[4753]: I1005 20:33:25.800404 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.800212522 podStartE2EDuration="2.800212522s" podCreationTimestamp="2025-10-05 20:33:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:33:25.790426626 +0000 UTC m=+1114.638754868" watchObservedRunningTime="2025-10-05 20:33:25.800212522 +0000 UTC m=+1114.648540754" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.083756 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 05 20:33:26 crc kubenswrapper[4753]: 
I1005 20:33:26.775782 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.783001 4753 generic.go:334] "Generic (PLEG): container finished" podID="8a332dcd-7d9c-402c-9560-361e59390857" containerID="5a6141a781d1d8f165f88e914263568ae13298e5d59c73c43301c6bf27649d46" exitCode=0 Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.783739 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.783893 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a332dcd-7d9c-402c-9560-361e59390857","Type":"ContainerDied","Data":"5a6141a781d1d8f165f88e914263568ae13298e5d59c73c43301c6bf27649d46"} Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.783922 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a332dcd-7d9c-402c-9560-361e59390857","Type":"ContainerDied","Data":"e7b48bddffb0a53dc7c20d091ef3f285e91d3169ff7948b817ae23ee8fbc4bed"} Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.783944 4753 scope.go:117] "RemoveContainer" containerID="5a6141a781d1d8f165f88e914263568ae13298e5d59c73c43301c6bf27649d46" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.802958 4753 scope.go:117] "RemoveContainer" containerID="e13d5b7f11e8bd358808a7cacb27e809be0ea56befb2ed72d0e65d4124f9626a" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.831087 4753 scope.go:117] "RemoveContainer" containerID="5a6141a781d1d8f165f88e914263568ae13298e5d59c73c43301c6bf27649d46" Oct 05 20:33:26 crc kubenswrapper[4753]: E1005 20:33:26.831642 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a6141a781d1d8f165f88e914263568ae13298e5d59c73c43301c6bf27649d46\": container with ID starting with 
5a6141a781d1d8f165f88e914263568ae13298e5d59c73c43301c6bf27649d46 not found: ID does not exist" containerID="5a6141a781d1d8f165f88e914263568ae13298e5d59c73c43301c6bf27649d46" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.831694 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a6141a781d1d8f165f88e914263568ae13298e5d59c73c43301c6bf27649d46"} err="failed to get container status \"5a6141a781d1d8f165f88e914263568ae13298e5d59c73c43301c6bf27649d46\": rpc error: code = NotFound desc = could not find container \"5a6141a781d1d8f165f88e914263568ae13298e5d59c73c43301c6bf27649d46\": container with ID starting with 5a6141a781d1d8f165f88e914263568ae13298e5d59c73c43301c6bf27649d46 not found: ID does not exist" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.831724 4753 scope.go:117] "RemoveContainer" containerID="e13d5b7f11e8bd358808a7cacb27e809be0ea56befb2ed72d0e65d4124f9626a" Oct 05 20:33:26 crc kubenswrapper[4753]: E1005 20:33:26.832036 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e13d5b7f11e8bd358808a7cacb27e809be0ea56befb2ed72d0e65d4124f9626a\": container with ID starting with e13d5b7f11e8bd358808a7cacb27e809be0ea56befb2ed72d0e65d4124f9626a not found: ID does not exist" containerID="e13d5b7f11e8bd358808a7cacb27e809be0ea56befb2ed72d0e65d4124f9626a" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.832074 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e13d5b7f11e8bd358808a7cacb27e809be0ea56befb2ed72d0e65d4124f9626a"} err="failed to get container status \"e13d5b7f11e8bd358808a7cacb27e809be0ea56befb2ed72d0e65d4124f9626a\": rpc error: code = NotFound desc = could not find container \"e13d5b7f11e8bd358808a7cacb27e809be0ea56befb2ed72d0e65d4124f9626a\": container with ID starting with e13d5b7f11e8bd358808a7cacb27e809be0ea56befb2ed72d0e65d4124f9626a not found: ID does not 
exist" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.849443 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhtz4\" (UniqueName: \"kubernetes.io/projected/8a332dcd-7d9c-402c-9560-361e59390857-kube-api-access-zhtz4\") pod \"8a332dcd-7d9c-402c-9560-361e59390857\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.849516 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a332dcd-7d9c-402c-9560-361e59390857-config-data\") pod \"8a332dcd-7d9c-402c-9560-361e59390857\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.849585 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a332dcd-7d9c-402c-9560-361e59390857-logs\") pod \"8a332dcd-7d9c-402c-9560-361e59390857\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.849679 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a332dcd-7d9c-402c-9560-361e59390857-combined-ca-bundle\") pod \"8a332dcd-7d9c-402c-9560-361e59390857\" (UID: \"8a332dcd-7d9c-402c-9560-361e59390857\") " Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.850343 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a332dcd-7d9c-402c-9560-361e59390857-logs" (OuterVolumeSpecName: "logs") pod "8a332dcd-7d9c-402c-9560-361e59390857" (UID: "8a332dcd-7d9c-402c-9560-361e59390857"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.855703 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a332dcd-7d9c-402c-9560-361e59390857-kube-api-access-zhtz4" (OuterVolumeSpecName: "kube-api-access-zhtz4") pod "8a332dcd-7d9c-402c-9560-361e59390857" (UID: "8a332dcd-7d9c-402c-9560-361e59390857"). InnerVolumeSpecName "kube-api-access-zhtz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.877856 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a332dcd-7d9c-402c-9560-361e59390857-config-data" (OuterVolumeSpecName: "config-data") pod "8a332dcd-7d9c-402c-9560-361e59390857" (UID: "8a332dcd-7d9c-402c-9560-361e59390857"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.886272 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a332dcd-7d9c-402c-9560-361e59390857-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a332dcd-7d9c-402c-9560-361e59390857" (UID: "8a332dcd-7d9c-402c-9560-361e59390857"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.953304 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhtz4\" (UniqueName: \"kubernetes.io/projected/8a332dcd-7d9c-402c-9560-361e59390857-kube-api-access-zhtz4\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.953334 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a332dcd-7d9c-402c-9560-361e59390857-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.953343 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a332dcd-7d9c-402c-9560-361e59390857-logs\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:26 crc kubenswrapper[4753]: I1005 20:33:26.953354 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a332dcd-7d9c-402c-9560-361e59390857-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.101291 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.101333 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.114756 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.123258 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.146811 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 05 20:33:27 crc kubenswrapper[4753]: E1005 20:33:27.147387 4753 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8a332dcd-7d9c-402c-9560-361e59390857" containerName="nova-api-api" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.147418 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a332dcd-7d9c-402c-9560-361e59390857" containerName="nova-api-api" Oct 05 20:33:27 crc kubenswrapper[4753]: E1005 20:33:27.147457 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a332dcd-7d9c-402c-9560-361e59390857" containerName="nova-api-log" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.147465 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a332dcd-7d9c-402c-9560-361e59390857" containerName="nova-api-log" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.147690 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a332dcd-7d9c-402c-9560-361e59390857" containerName="nova-api-api" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.147732 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a332dcd-7d9c-402c-9560-361e59390857" containerName="nova-api-log" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.148959 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.151101 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.155780 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ee31d4e-f57f-4b98-89f8-343679496901-config-data\") pod \"nova-api-0\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " pod="openstack/nova-api-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.155884 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jtng\" (UniqueName: \"kubernetes.io/projected/6ee31d4e-f57f-4b98-89f8-343679496901-kube-api-access-9jtng\") pod \"nova-api-0\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " pod="openstack/nova-api-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.155988 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ee31d4e-f57f-4b98-89f8-343679496901-logs\") pod \"nova-api-0\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " pod="openstack/nova-api-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.156099 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ee31d4e-f57f-4b98-89f8-343679496901-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " pod="openstack/nova-api-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.164342 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.258240 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9jtng\" (UniqueName: \"kubernetes.io/projected/6ee31d4e-f57f-4b98-89f8-343679496901-kube-api-access-9jtng\") pod \"nova-api-0\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " pod="openstack/nova-api-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.258560 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ee31d4e-f57f-4b98-89f8-343679496901-logs\") pod \"nova-api-0\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " pod="openstack/nova-api-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.258596 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ee31d4e-f57f-4b98-89f8-343679496901-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " pod="openstack/nova-api-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.258685 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ee31d4e-f57f-4b98-89f8-343679496901-config-data\") pod \"nova-api-0\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " pod="openstack/nova-api-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.258920 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ee31d4e-f57f-4b98-89f8-343679496901-logs\") pod \"nova-api-0\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " pod="openstack/nova-api-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.262904 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ee31d4e-f57f-4b98-89f8-343679496901-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " pod="openstack/nova-api-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.263024 
4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ee31d4e-f57f-4b98-89f8-343679496901-config-data\") pod \"nova-api-0\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " pod="openstack/nova-api-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.276317 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jtng\" (UniqueName: \"kubernetes.io/projected/6ee31d4e-f57f-4b98-89f8-343679496901-kube-api-access-9jtng\") pod \"nova-api-0\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " pod="openstack/nova-api-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.476887 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.863968 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a332dcd-7d9c-402c-9560-361e59390857" path="/var/lib/kubelet/pods/8a332dcd-7d9c-402c-9560-361e59390857/volumes" Oct 05 20:33:27 crc kubenswrapper[4753]: I1005 20:33:27.989609 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 05 20:33:28 crc kubenswrapper[4753]: I1005 20:33:28.800947 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6ee31d4e-f57f-4b98-89f8-343679496901","Type":"ContainerStarted","Data":"67a86a61bfee73d54d1c41c85d384c50fbd39cef8867216a65775eae05b718d5"} Oct 05 20:33:28 crc kubenswrapper[4753]: I1005 20:33:28.801013 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6ee31d4e-f57f-4b98-89f8-343679496901","Type":"ContainerStarted","Data":"db1589ed139a8f143b3b0098caa03a5ed5405c0811b32140129d3cd4cf32dfe3"} Oct 05 20:33:28 crc kubenswrapper[4753]: I1005 20:33:28.801026 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"6ee31d4e-f57f-4b98-89f8-343679496901","Type":"ContainerStarted","Data":"d243ebfa12f509b36a4da0d5c05a33dd6f4cf3a5c68e859495a701c0b0c2abf1"} Oct 05 20:33:28 crc kubenswrapper[4753]: I1005 20:33:28.819655 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.8196356580000002 podStartE2EDuration="1.819635658s" podCreationTimestamp="2025-10-05 20:33:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:33:28.817374108 +0000 UTC m=+1117.665702340" watchObservedRunningTime="2025-10-05 20:33:28.819635658 +0000 UTC m=+1117.667963890" Oct 05 20:33:29 crc kubenswrapper[4753]: I1005 20:33:29.177552 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 05 20:33:30 crc kubenswrapper[4753]: I1005 20:33:30.573848 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 05 20:33:32 crc kubenswrapper[4753]: I1005 20:33:32.101039 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 05 20:33:32 crc kubenswrapper[4753]: I1005 20:33:32.102235 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 05 20:33:33 crc kubenswrapper[4753]: I1005 20:33:33.118349 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 05 20:33:33 crc kubenswrapper[4753]: I1005 20:33:33.118378 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" containerName="nova-metadata-log" 
probeResult="failure" output="Get \"https://10.217.0.178:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 05 20:33:33 crc kubenswrapper[4753]: I1005 20:33:33.140259 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 05 20:33:33 crc kubenswrapper[4753]: I1005 20:33:33.845250 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5d0d97ee-e7c7-4f1c-b232-b6377a0c890f" containerName="kube-state-metrics" containerID="cri-o://f96fabe284eb93f7b4e6d91af5e8dfce1b504522093955701a5a1f520c479d1f" gracePeriod=30 Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.178360 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.218440 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.318756 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.361314 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.361761 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="ceilometer-central-agent" containerID="cri-o://5eb60247cb3ef6af8280a13e7f38dcec1294e0cd49b0053d1ece17713f412310" gracePeriod=30 Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.361915 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="proxy-httpd" containerID="cri-o://479a2c9e2d4320203219f6ce98880f91a2d4c825cd0a5eaa5c85ff06f32d8bec" gracePeriod=30 Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.361969 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="sg-core" containerID="cri-o://24888d45a9c6e3091f5b36f909ec1d4356eafe88accf60937c58f1eb4b5a59db" gracePeriod=30 Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.362011 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="ceilometer-notification-agent" containerID="cri-o://375a2c45be80c2b18c8e6e41b91ad8e21299a50d1ecc3e966854ec9b80d208f3" gracePeriod=30 Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.510127 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r249z\" (UniqueName: \"kubernetes.io/projected/5d0d97ee-e7c7-4f1c-b232-b6377a0c890f-kube-api-access-r249z\") pod \"5d0d97ee-e7c7-4f1c-b232-b6377a0c890f\" (UID: \"5d0d97ee-e7c7-4f1c-b232-b6377a0c890f\") " Oct 05 20:33:34 crc 
kubenswrapper[4753]: I1005 20:33:34.522568 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d0d97ee-e7c7-4f1c-b232-b6377a0c890f-kube-api-access-r249z" (OuterVolumeSpecName: "kube-api-access-r249z") pod "5d0d97ee-e7c7-4f1c-b232-b6377a0c890f" (UID: "5d0d97ee-e7c7-4f1c-b232-b6377a0c890f"). InnerVolumeSpecName "kube-api-access-r249z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.613386 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r249z\" (UniqueName: \"kubernetes.io/projected/5d0d97ee-e7c7-4f1c-b232-b6377a0c890f-kube-api-access-r249z\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.859598 4753 generic.go:334] "Generic (PLEG): container finished" podID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerID="479a2c9e2d4320203219f6ce98880f91a2d4c825cd0a5eaa5c85ff06f32d8bec" exitCode=0 Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.859632 4753 generic.go:334] "Generic (PLEG): container finished" podID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerID="24888d45a9c6e3091f5b36f909ec1d4356eafe88accf60937c58f1eb4b5a59db" exitCode=2 Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.859642 4753 generic.go:334] "Generic (PLEG): container finished" podID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerID="375a2c45be80c2b18c8e6e41b91ad8e21299a50d1ecc3e966854ec9b80d208f3" exitCode=0 Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.859652 4753 generic.go:334] "Generic (PLEG): container finished" podID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerID="5eb60247cb3ef6af8280a13e7f38dcec1294e0cd49b0053d1ece17713f412310" exitCode=0 Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.859657 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"072980fd-9fc2-464a-8c86-4201a00a3bae","Type":"ContainerDied","Data":"479a2c9e2d4320203219f6ce98880f91a2d4c825cd0a5eaa5c85ff06f32d8bec"} Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.859696 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"072980fd-9fc2-464a-8c86-4201a00a3bae","Type":"ContainerDied","Data":"24888d45a9c6e3091f5b36f909ec1d4356eafe88accf60937c58f1eb4b5a59db"} Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.859705 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"072980fd-9fc2-464a-8c86-4201a00a3bae","Type":"ContainerDied","Data":"375a2c45be80c2b18c8e6e41b91ad8e21299a50d1ecc3e966854ec9b80d208f3"} Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.859716 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"072980fd-9fc2-464a-8c86-4201a00a3bae","Type":"ContainerDied","Data":"5eb60247cb3ef6af8280a13e7f38dcec1294e0cd49b0053d1ece17713f412310"} Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.860734 4753 generic.go:334] "Generic (PLEG): container finished" podID="5d0d97ee-e7c7-4f1c-b232-b6377a0c890f" containerID="f96fabe284eb93f7b4e6d91af5e8dfce1b504522093955701a5a1f520c479d1f" exitCode=2 Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.860757 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5d0d97ee-e7c7-4f1c-b232-b6377a0c890f","Type":"ContainerDied","Data":"f96fabe284eb93f7b4e6d91af5e8dfce1b504522093955701a5a1f520c479d1f"} Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.860782 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5d0d97ee-e7c7-4f1c-b232-b6377a0c890f","Type":"ContainerDied","Data":"c26c915c793fecef25e34620e16738fe40c9f879a44b5508ab01260d3b2869f0"} Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.860800 4753 scope.go:117] 
"RemoveContainer" containerID="f96fabe284eb93f7b4e6d91af5e8dfce1b504522093955701a5a1f520c479d1f" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.860801 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.909899 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.910646 4753 scope.go:117] "RemoveContainer" containerID="f96fabe284eb93f7b4e6d91af5e8dfce1b504522093955701a5a1f520c479d1f" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.910883 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 05 20:33:34 crc kubenswrapper[4753]: E1005 20:33:34.912698 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f96fabe284eb93f7b4e6d91af5e8dfce1b504522093955701a5a1f520c479d1f\": container with ID starting with f96fabe284eb93f7b4e6d91af5e8dfce1b504522093955701a5a1f520c479d1f not found: ID does not exist" containerID="f96fabe284eb93f7b4e6d91af5e8dfce1b504522093955701a5a1f520c479d1f" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.912734 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f96fabe284eb93f7b4e6d91af5e8dfce1b504522093955701a5a1f520c479d1f"} err="failed to get container status \"f96fabe284eb93f7b4e6d91af5e8dfce1b504522093955701a5a1f520c479d1f\": rpc error: code = NotFound desc = could not find container \"f96fabe284eb93f7b4e6d91af5e8dfce1b504522093955701a5a1f520c479d1f\": container with ID starting with f96fabe284eb93f7b4e6d91af5e8dfce1b504522093955701a5a1f520c479d1f not found: ID does not exist" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.923968 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] 
Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.944503 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 05 20:33:34 crc kubenswrapper[4753]: E1005 20:33:34.944966 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0d97ee-e7c7-4f1c-b232-b6377a0c890f" containerName="kube-state-metrics" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.944983 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0d97ee-e7c7-4f1c-b232-b6377a0c890f" containerName="kube-state-metrics" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.945197 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d0d97ee-e7c7-4f1c-b232-b6377a0c890f" containerName="kube-state-metrics" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.945794 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.949197 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.949716 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 05 20:33:34 crc kubenswrapper[4753]: I1005 20:33:34.961069 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.021520 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5dd5b3b0-432b-4040-8544-d68497fca1de-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5dd5b3b0-432b-4040-8544-d68497fca1de\") " pod="openstack/kube-state-metrics-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.021567 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd5b3b0-432b-4040-8544-d68497fca1de-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5dd5b3b0-432b-4040-8544-d68497fca1de\") " pod="openstack/kube-state-metrics-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.021633 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdzw9\" (UniqueName: \"kubernetes.io/projected/5dd5b3b0-432b-4040-8544-d68497fca1de-kube-api-access-zdzw9\") pod \"kube-state-metrics-0\" (UID: \"5dd5b3b0-432b-4040-8544-d68497fca1de\") " pod="openstack/kube-state-metrics-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.021658 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dd5b3b0-432b-4040-8544-d68497fca1de-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5dd5b3b0-432b-4040-8544-d68497fca1de\") " pod="openstack/kube-state-metrics-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.124283 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd5b3b0-432b-4040-8544-d68497fca1de-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5dd5b3b0-432b-4040-8544-d68497fca1de\") " pod="openstack/kube-state-metrics-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.124388 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdzw9\" (UniqueName: \"kubernetes.io/projected/5dd5b3b0-432b-4040-8544-d68497fca1de-kube-api-access-zdzw9\") pod \"kube-state-metrics-0\" (UID: \"5dd5b3b0-432b-4040-8544-d68497fca1de\") " pod="openstack/kube-state-metrics-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.124423 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dd5b3b0-432b-4040-8544-d68497fca1de-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5dd5b3b0-432b-4040-8544-d68497fca1de\") " pod="openstack/kube-state-metrics-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.124487 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5dd5b3b0-432b-4040-8544-d68497fca1de-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5dd5b3b0-432b-4040-8544-d68497fca1de\") " pod="openstack/kube-state-metrics-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.132062 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dd5b3b0-432b-4040-8544-d68497fca1de-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5dd5b3b0-432b-4040-8544-d68497fca1de\") " pod="openstack/kube-state-metrics-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.142948 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5dd5b3b0-432b-4040-8544-d68497fca1de-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5dd5b3b0-432b-4040-8544-d68497fca1de\") " pod="openstack/kube-state-metrics-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.145674 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdzw9\" (UniqueName: \"kubernetes.io/projected/5dd5b3b0-432b-4040-8544-d68497fca1de-kube-api-access-zdzw9\") pod \"kube-state-metrics-0\" (UID: \"5dd5b3b0-432b-4040-8544-d68497fca1de\") " pod="openstack/kube-state-metrics-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.146892 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd5b3b0-432b-4040-8544-d68497fca1de-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5dd5b3b0-432b-4040-8544-d68497fca1de\") " pod="openstack/kube-state-metrics-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.256186 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.302785 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.328457 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-scripts\") pod \"072980fd-9fc2-464a-8c86-4201a00a3bae\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.328664 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/072980fd-9fc2-464a-8c86-4201a00a3bae-run-httpd\") pod \"072980fd-9fc2-464a-8c86-4201a00a3bae\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.328694 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/072980fd-9fc2-464a-8c86-4201a00a3bae-log-httpd\") pod \"072980fd-9fc2-464a-8c86-4201a00a3bae\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.328739 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk8n8\" (UniqueName: \"kubernetes.io/projected/072980fd-9fc2-464a-8c86-4201a00a3bae-kube-api-access-pk8n8\") pod \"072980fd-9fc2-464a-8c86-4201a00a3bae\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " Oct 05 20:33:35 
crc kubenswrapper[4753]: I1005 20:33:35.328834 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-sg-core-conf-yaml\") pod \"072980fd-9fc2-464a-8c86-4201a00a3bae\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.328946 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-config-data\") pod \"072980fd-9fc2-464a-8c86-4201a00a3bae\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.329020 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-combined-ca-bundle\") pod \"072980fd-9fc2-464a-8c86-4201a00a3bae\" (UID: \"072980fd-9fc2-464a-8c86-4201a00a3bae\") " Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.331372 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/072980fd-9fc2-464a-8c86-4201a00a3bae-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "072980fd-9fc2-464a-8c86-4201a00a3bae" (UID: "072980fd-9fc2-464a-8c86-4201a00a3bae"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.331754 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/072980fd-9fc2-464a-8c86-4201a00a3bae-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "072980fd-9fc2-464a-8c86-4201a00a3bae" (UID: "072980fd-9fc2-464a-8c86-4201a00a3bae"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.332082 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-scripts" (OuterVolumeSpecName: "scripts") pod "072980fd-9fc2-464a-8c86-4201a00a3bae" (UID: "072980fd-9fc2-464a-8c86-4201a00a3bae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.332921 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/072980fd-9fc2-464a-8c86-4201a00a3bae-kube-api-access-pk8n8" (OuterVolumeSpecName: "kube-api-access-pk8n8") pod "072980fd-9fc2-464a-8c86-4201a00a3bae" (UID: "072980fd-9fc2-464a-8c86-4201a00a3bae"). InnerVolumeSpecName "kube-api-access-pk8n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.371269 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "072980fd-9fc2-464a-8c86-4201a00a3bae" (UID: "072980fd-9fc2-464a-8c86-4201a00a3bae"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.440387 4753 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/072980fd-9fc2-464a-8c86-4201a00a3bae-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.440412 4753 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/072980fd-9fc2-464a-8c86-4201a00a3bae-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.440421 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk8n8\" (UniqueName: \"kubernetes.io/projected/072980fd-9fc2-464a-8c86-4201a00a3bae-kube-api-access-pk8n8\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.440433 4753 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.440443 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.448814 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "072980fd-9fc2-464a-8c86-4201a00a3bae" (UID: "072980fd-9fc2-464a-8c86-4201a00a3bae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.449024 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-config-data" (OuterVolumeSpecName: "config-data") pod "072980fd-9fc2-464a-8c86-4201a00a3bae" (UID: "072980fd-9fc2-464a-8c86-4201a00a3bae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.542380 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.542410 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072980fd-9fc2-464a-8c86-4201a00a3bae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.814242 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 05 20:33:35 crc kubenswrapper[4753]: W1005 20:33:35.815480 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dd5b3b0_432b_4040_8544_d68497fca1de.slice/crio-671a87e99abebbc66e1ea95774f976fc0c7b72382372688049ac3f8bd47bc270 WatchSource:0}: Error finding container 671a87e99abebbc66e1ea95774f976fc0c7b72382372688049ac3f8bd47bc270: Status 404 returned error can't find the container with id 671a87e99abebbc66e1ea95774f976fc0c7b72382372688049ac3f8bd47bc270 Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.863953 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d0d97ee-e7c7-4f1c-b232-b6377a0c890f" path="/var/lib/kubelet/pods/5d0d97ee-e7c7-4f1c-b232-b6377a0c890f/volumes" Oct 05 20:33:35 crc kubenswrapper[4753]: 
I1005 20:33:35.869590 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5dd5b3b0-432b-4040-8544-d68497fca1de","Type":"ContainerStarted","Data":"671a87e99abebbc66e1ea95774f976fc0c7b72382372688049ac3f8bd47bc270"} Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.874338 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.874596 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"072980fd-9fc2-464a-8c86-4201a00a3bae","Type":"ContainerDied","Data":"cf138043f72ad1bd9994af9c39807b7fca6ddc07e00e018b2e2da9b9fe6bc6d3"} Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.874658 4753 scope.go:117] "RemoveContainer" containerID="479a2c9e2d4320203219f6ce98880f91a2d4c825cd0a5eaa5c85ff06f32d8bec" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.899313 4753 scope.go:117] "RemoveContainer" containerID="24888d45a9c6e3091f5b36f909ec1d4356eafe88accf60937c58f1eb4b5a59db" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.914454 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.933397 4753 scope.go:117] "RemoveContainer" containerID="375a2c45be80c2b18c8e6e41b91ad8e21299a50d1ecc3e966854ec9b80d208f3" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.933704 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.938558 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:35 crc kubenswrapper[4753]: E1005 20:33:35.938907 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="proxy-httpd" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.938922 4753 
state_mem.go:107] "Deleted CPUSet assignment" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="proxy-httpd" Oct 05 20:33:35 crc kubenswrapper[4753]: E1005 20:33:35.938936 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="ceilometer-notification-agent" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.938943 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="ceilometer-notification-agent" Oct 05 20:33:35 crc kubenswrapper[4753]: E1005 20:33:35.938951 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="sg-core" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.938959 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="sg-core" Oct 05 20:33:35 crc kubenswrapper[4753]: E1005 20:33:35.938977 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="ceilometer-central-agent" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.938983 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="ceilometer-central-agent" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.939206 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="ceilometer-central-agent" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.939238 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="sg-core" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.939250 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="ceilometer-notification-agent" Oct 05 20:33:35 crc 
kubenswrapper[4753]: I1005 20:33:35.939258 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" containerName="proxy-httpd" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.940728 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.943626 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.947148 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.947312 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.962549 4753 scope.go:117] "RemoveContainer" containerID="5eb60247cb3ef6af8280a13e7f38dcec1294e0cd49b0053d1ece17713f412310" Oct 05 20:33:35 crc kubenswrapper[4753]: I1005 20:33:35.964958 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.051830 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.051897 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a78918-103b-43ae-ab70-9b0180206f3d-run-httpd\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.051946 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a78918-103b-43ae-ab70-9b0180206f3d-log-httpd\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.051972 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.051991 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-config-data\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.052018 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.052192 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92jpp\" (UniqueName: \"kubernetes.io/projected/92a78918-103b-43ae-ab70-9b0180206f3d-kube-api-access-92jpp\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.052255 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-scripts\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.153879 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.153943 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92jpp\" (UniqueName: \"kubernetes.io/projected/92a78918-103b-43ae-ab70-9b0180206f3d-kube-api-access-92jpp\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.153971 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-scripts\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.154020 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.154056 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a78918-103b-43ae-ab70-9b0180206f3d-run-httpd\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc 
kubenswrapper[4753]: I1005 20:33:36.154091 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a78918-103b-43ae-ab70-9b0180206f3d-log-httpd\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.154113 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.154132 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-config-data\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.155655 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a78918-103b-43ae-ab70-9b0180206f3d-log-httpd\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.155754 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a78918-103b-43ae-ab70-9b0180206f3d-run-httpd\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.160231 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-scripts\") pod \"ceilometer-0\" (UID: 
\"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.160653 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.161565 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-config-data\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.161618 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.162287 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.179662 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92jpp\" (UniqueName: \"kubernetes.io/projected/92a78918-103b-43ae-ab70-9b0180206f3d-kube-api-access-92jpp\") pod \"ceilometer-0\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.316209 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.821829 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:36 crc kubenswrapper[4753]: W1005 20:33:36.829663 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92a78918_103b_43ae_ab70_9b0180206f3d.slice/crio-ba6ad238c6ef360abc04e6d073de1cf8d5b2b611738e7860cc560ff5b1b7916f WatchSource:0}: Error finding container ba6ad238c6ef360abc04e6d073de1cf8d5b2b611738e7860cc560ff5b1b7916f: Status 404 returned error can't find the container with id ba6ad238c6ef360abc04e6d073de1cf8d5b2b611738e7860cc560ff5b1b7916f Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.890801 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5dd5b3b0-432b-4040-8544-d68497fca1de","Type":"ContainerStarted","Data":"1e174ba2b006e2a0c918ff6362fe4d12d002e94fc4ac96aca6171654c9dca2a5"} Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.890979 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 05 20:33:36 crc kubenswrapper[4753]: I1005 20:33:36.892373 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a78918-103b-43ae-ab70-9b0180206f3d","Type":"ContainerStarted","Data":"ba6ad238c6ef360abc04e6d073de1cf8d5b2b611738e7860cc560ff5b1b7916f"} Oct 05 20:33:37 crc kubenswrapper[4753]: I1005 20:33:37.477973 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 05 20:33:37 crc kubenswrapper[4753]: I1005 20:33:37.478303 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 05 20:33:37 crc kubenswrapper[4753]: I1005 20:33:37.861743 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="072980fd-9fc2-464a-8c86-4201a00a3bae" path="/var/lib/kubelet/pods/072980fd-9fc2-464a-8c86-4201a00a3bae/volumes" Oct 05 20:33:37 crc kubenswrapper[4753]: I1005 20:33:37.900923 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a78918-103b-43ae-ab70-9b0180206f3d","Type":"ContainerStarted","Data":"795c0827726e33acf2c2701ce3fe485a5f0ecfe0baeeb402e25cd3b0905405d7"} Oct 05 20:33:38 crc kubenswrapper[4753]: I1005 20:33:38.560372 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6ee31d4e-f57f-4b98-89f8-343679496901" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 05 20:33:38 crc kubenswrapper[4753]: I1005 20:33:38.560687 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6ee31d4e-f57f-4b98-89f8-343679496901" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 05 20:33:38 crc kubenswrapper[4753]: I1005 20:33:38.918432 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a78918-103b-43ae-ab70-9b0180206f3d","Type":"ContainerStarted","Data":"d5f5c2d87a339558b8c7d072d0368a65cae97b026863695b0b745901a330aae2"} Oct 05 20:33:39 crc kubenswrapper[4753]: I1005 20:33:39.929808 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a78918-103b-43ae-ab70-9b0180206f3d","Type":"ContainerStarted","Data":"52563c869e0b7d0ff34706d82591063a3c61749c4a619ef0e1a87282cf4b0c13"} Oct 05 20:33:40 crc kubenswrapper[4753]: I1005 20:33:40.939228 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"92a78918-103b-43ae-ab70-9b0180206f3d","Type":"ContainerStarted","Data":"82fb32666c6c51c7575059470f0d7b9771372741626a985ea5becf40d5ce747b"} Oct 05 20:33:40 crc kubenswrapper[4753]: I1005 20:33:40.962439 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.349082713 podStartE2EDuration="5.962406054s" podCreationTimestamp="2025-10-05 20:33:35 +0000 UTC" firstStartedPulling="2025-10-05 20:33:36.833553455 +0000 UTC m=+1125.681881687" lastFinishedPulling="2025-10-05 20:33:40.446876796 +0000 UTC m=+1129.295205028" observedRunningTime="2025-10-05 20:33:40.957695568 +0000 UTC m=+1129.806023800" watchObservedRunningTime="2025-10-05 20:33:40.962406054 +0000 UTC m=+1129.810734306" Oct 05 20:33:40 crc kubenswrapper[4753]: I1005 20:33:40.975470 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=6.560715105 podStartE2EDuration="6.975451268s" podCreationTimestamp="2025-10-05 20:33:34 +0000 UTC" firstStartedPulling="2025-10-05 20:33:35.817356239 +0000 UTC m=+1124.665684471" lastFinishedPulling="2025-10-05 20:33:36.232092412 +0000 UTC m=+1125.080420634" observedRunningTime="2025-10-05 20:33:36.912444272 +0000 UTC m=+1125.760772504" watchObservedRunningTime="2025-10-05 20:33:40.975451268 +0000 UTC m=+1129.823779500" Oct 05 20:33:41 crc kubenswrapper[4753]: I1005 20:33:41.947912 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 05 20:33:42 crc kubenswrapper[4753]: I1005 20:33:42.105848 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 05 20:33:42 crc kubenswrapper[4753]: I1005 20:33:42.109930 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 05 20:33:42 crc kubenswrapper[4753]: I1005 20:33:42.120886 4753 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 05 20:33:42 crc kubenswrapper[4753]: I1005 20:33:42.962985 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 05 20:33:45 crc kubenswrapper[4753]: I1005 20:33:45.317452 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 05 20:33:45 crc kubenswrapper[4753]: I1005 20:33:45.965376 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.002907 4753 generic.go:334] "Generic (PLEG): container finished" podID="a2d24ec3-181b-4651-ab1d-59f1975c052a" containerID="b87b29ecafec68b68c54b7c5e2d72250c93a4358248dd72cbd1a0369ae13f570" exitCode=137 Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.002951 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a2d24ec3-181b-4651-ab1d-59f1975c052a","Type":"ContainerDied","Data":"b87b29ecafec68b68c54b7c5e2d72250c93a4358248dd72cbd1a0369ae13f570"} Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.002980 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a2d24ec3-181b-4651-ab1d-59f1975c052a","Type":"ContainerDied","Data":"b8572bee91d93572d19249f9bcbae54b042c1c9fd4849e0d0dc5b355d91d660e"} Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.003000 4753 scope.go:117] "RemoveContainer" containerID="b87b29ecafec68b68c54b7c5e2d72250c93a4358248dd72cbd1a0369ae13f570" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.003111 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.028837 4753 scope.go:117] "RemoveContainer" containerID="b87b29ecafec68b68c54b7c5e2d72250c93a4358248dd72cbd1a0369ae13f570" Oct 05 20:33:46 crc kubenswrapper[4753]: E1005 20:33:46.029343 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b87b29ecafec68b68c54b7c5e2d72250c93a4358248dd72cbd1a0369ae13f570\": container with ID starting with b87b29ecafec68b68c54b7c5e2d72250c93a4358248dd72cbd1a0369ae13f570 not found: ID does not exist" containerID="b87b29ecafec68b68c54b7c5e2d72250c93a4358248dd72cbd1a0369ae13f570" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.029381 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b87b29ecafec68b68c54b7c5e2d72250c93a4358248dd72cbd1a0369ae13f570"} err="failed to get container status \"b87b29ecafec68b68c54b7c5e2d72250c93a4358248dd72cbd1a0369ae13f570\": rpc error: code = NotFound desc = could not find container \"b87b29ecafec68b68c54b7c5e2d72250c93a4358248dd72cbd1a0369ae13f570\": container with ID starting with b87b29ecafec68b68c54b7c5e2d72250c93a4358248dd72cbd1a0369ae13f570 not found: ID does not exist" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.129975 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d24ec3-181b-4651-ab1d-59f1975c052a-combined-ca-bundle\") pod \"a2d24ec3-181b-4651-ab1d-59f1975c052a\" (UID: \"a2d24ec3-181b-4651-ab1d-59f1975c052a\") " Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.130075 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d24ec3-181b-4651-ab1d-59f1975c052a-config-data\") pod \"a2d24ec3-181b-4651-ab1d-59f1975c052a\" (UID: 
\"a2d24ec3-181b-4651-ab1d-59f1975c052a\") " Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.130219 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkrsb\" (UniqueName: \"kubernetes.io/projected/a2d24ec3-181b-4651-ab1d-59f1975c052a-kube-api-access-vkrsb\") pod \"a2d24ec3-181b-4651-ab1d-59f1975c052a\" (UID: \"a2d24ec3-181b-4651-ab1d-59f1975c052a\") " Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.135728 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2d24ec3-181b-4651-ab1d-59f1975c052a-kube-api-access-vkrsb" (OuterVolumeSpecName: "kube-api-access-vkrsb") pod "a2d24ec3-181b-4651-ab1d-59f1975c052a" (UID: "a2d24ec3-181b-4651-ab1d-59f1975c052a"). InnerVolumeSpecName "kube-api-access-vkrsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.161290 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2d24ec3-181b-4651-ab1d-59f1975c052a-config-data" (OuterVolumeSpecName: "config-data") pod "a2d24ec3-181b-4651-ab1d-59f1975c052a" (UID: "a2d24ec3-181b-4651-ab1d-59f1975c052a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.164409 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2d24ec3-181b-4651-ab1d-59f1975c052a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2d24ec3-181b-4651-ab1d-59f1975c052a" (UID: "a2d24ec3-181b-4651-ab1d-59f1975c052a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.232823 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d24ec3-181b-4651-ab1d-59f1975c052a-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.232860 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkrsb\" (UniqueName: \"kubernetes.io/projected/a2d24ec3-181b-4651-ab1d-59f1975c052a-kube-api-access-vkrsb\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.232881 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d24ec3-181b-4651-ab1d-59f1975c052a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.331074 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.339549 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.360826 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 05 20:33:46 crc kubenswrapper[4753]: E1005 20:33:46.361201 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d24ec3-181b-4651-ab1d-59f1975c052a" containerName="nova-cell1-novncproxy-novncproxy" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.361218 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d24ec3-181b-4651-ab1d-59f1975c052a" containerName="nova-cell1-novncproxy-novncproxy" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.361382 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d24ec3-181b-4651-ab1d-59f1975c052a" containerName="nova-cell1-novncproxy-novncproxy" Oct 05 
20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.361924 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.364797 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.365259 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.380112 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.382817 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.538025 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/be546d4c-4192-4338-aaf3-2849807daf9d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"be546d4c-4192-4338-aaf3-2849807daf9d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.538102 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be546d4c-4192-4338-aaf3-2849807daf9d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"be546d4c-4192-4338-aaf3-2849807daf9d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.538242 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be546d4c-4192-4338-aaf3-2849807daf9d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"be546d4c-4192-4338-aaf3-2849807daf9d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.538474 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bp9d\" (UniqueName: \"kubernetes.io/projected/be546d4c-4192-4338-aaf3-2849807daf9d-kube-api-access-6bp9d\") pod \"nova-cell1-novncproxy-0\" (UID: \"be546d4c-4192-4338-aaf3-2849807daf9d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.538545 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/be546d4c-4192-4338-aaf3-2849807daf9d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"be546d4c-4192-4338-aaf3-2849807daf9d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.640644 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be546d4c-4192-4338-aaf3-2849807daf9d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"be546d4c-4192-4338-aaf3-2849807daf9d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.640735 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be546d4c-4192-4338-aaf3-2849807daf9d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"be546d4c-4192-4338-aaf3-2849807daf9d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.640781 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bp9d\" (UniqueName: \"kubernetes.io/projected/be546d4c-4192-4338-aaf3-2849807daf9d-kube-api-access-6bp9d\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"be546d4c-4192-4338-aaf3-2849807daf9d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.640802 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/be546d4c-4192-4338-aaf3-2849807daf9d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"be546d4c-4192-4338-aaf3-2849807daf9d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.640847 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/be546d4c-4192-4338-aaf3-2849807daf9d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"be546d4c-4192-4338-aaf3-2849807daf9d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.660261 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be546d4c-4192-4338-aaf3-2849807daf9d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"be546d4c-4192-4338-aaf3-2849807daf9d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.660666 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/be546d4c-4192-4338-aaf3-2849807daf9d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"be546d4c-4192-4338-aaf3-2849807daf9d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.661018 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/be546d4c-4192-4338-aaf3-2849807daf9d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"be546d4c-4192-4338-aaf3-2849807daf9d\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.661407 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be546d4c-4192-4338-aaf3-2849807daf9d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"be546d4c-4192-4338-aaf3-2849807daf9d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.674776 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bp9d\" (UniqueName: \"kubernetes.io/projected/be546d4c-4192-4338-aaf3-2849807daf9d-kube-api-access-6bp9d\") pod \"nova-cell1-novncproxy-0\" (UID: \"be546d4c-4192-4338-aaf3-2849807daf9d\") " pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:46 crc kubenswrapper[4753]: I1005 20:33:46.680879 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:47 crc kubenswrapper[4753]: I1005 20:33:47.146906 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 05 20:33:47 crc kubenswrapper[4753]: I1005 20:33:47.485103 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 05 20:33:47 crc kubenswrapper[4753]: I1005 20:33:47.485725 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 05 20:33:47 crc kubenswrapper[4753]: I1005 20:33:47.490617 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 05 20:33:47 crc kubenswrapper[4753]: I1005 20:33:47.492684 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 05 20:33:47 crc kubenswrapper[4753]: I1005 20:33:47.862728 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2d24ec3-181b-4651-ab1d-59f1975c052a" 
path="/var/lib/kubelet/pods/a2d24ec3-181b-4651-ab1d-59f1975c052a/volumes" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.022453 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"be546d4c-4192-4338-aaf3-2849807daf9d","Type":"ContainerStarted","Data":"4730c083cc22423fc41dca770840e72206aa6408d76e0879617498fac1333eb4"} Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.022485 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"be546d4c-4192-4338-aaf3-2849807daf9d","Type":"ContainerStarted","Data":"3d8c712fa3039ca500af63664865e17be01b0a14982c1cef10fb7ffdd44892ca"} Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.022498 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.038421 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.038406735 podStartE2EDuration="2.038406735s" podCreationTimestamp="2025-10-05 20:33:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:33:48.036716413 +0000 UTC m=+1136.885044645" watchObservedRunningTime="2025-10-05 20:33:48.038406735 +0000 UTC m=+1136.886734967" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.048936 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.273188 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7759979f65-h9vzh"] Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.278298 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.297157 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7759979f65-h9vzh"] Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.377423 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km76q\" (UniqueName: \"kubernetes.io/projected/4b6d8549-ef82-43f6-bc04-97565906e391-kube-api-access-km76q\") pod \"dnsmasq-dns-7759979f65-h9vzh\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.377464 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-ovsdbserver-nb\") pod \"dnsmasq-dns-7759979f65-h9vzh\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.377490 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-dns-svc\") pod \"dnsmasq-dns-7759979f65-h9vzh\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.377533 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-ovsdbserver-sb\") pod \"dnsmasq-dns-7759979f65-h9vzh\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.377603 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-config\") pod \"dnsmasq-dns-7759979f65-h9vzh\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.479122 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-config\") pod \"dnsmasq-dns-7759979f65-h9vzh\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.479223 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km76q\" (UniqueName: \"kubernetes.io/projected/4b6d8549-ef82-43f6-bc04-97565906e391-kube-api-access-km76q\") pod \"dnsmasq-dns-7759979f65-h9vzh\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.479246 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-ovsdbserver-nb\") pod \"dnsmasq-dns-7759979f65-h9vzh\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.479269 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-dns-svc\") pod \"dnsmasq-dns-7759979f65-h9vzh\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.479308 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-ovsdbserver-sb\") pod \"dnsmasq-dns-7759979f65-h9vzh\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.480596 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-ovsdbserver-nb\") pod \"dnsmasq-dns-7759979f65-h9vzh\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.480803 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-dns-svc\") pod \"dnsmasq-dns-7759979f65-h9vzh\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.480848 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-ovsdbserver-sb\") pod \"dnsmasq-dns-7759979f65-h9vzh\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.481646 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-config\") pod \"dnsmasq-dns-7759979f65-h9vzh\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.508230 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km76q\" (UniqueName: 
\"kubernetes.io/projected/4b6d8549-ef82-43f6-bc04-97565906e391-kube-api-access-km76q\") pod \"dnsmasq-dns-7759979f65-h9vzh\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:48 crc kubenswrapper[4753]: I1005 20:33:48.596754 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:49 crc kubenswrapper[4753]: I1005 20:33:49.134186 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7759979f65-h9vzh"] Oct 05 20:33:50 crc kubenswrapper[4753]: I1005 20:33:50.038375 4753 generic.go:334] "Generic (PLEG): container finished" podID="4b6d8549-ef82-43f6-bc04-97565906e391" containerID="55ec1bf876d6b75234d1facc4d30770dd47bd245958787b36f755fda13e3935d" exitCode=0 Oct 05 20:33:50 crc kubenswrapper[4753]: I1005 20:33:50.039728 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7759979f65-h9vzh" event={"ID":"4b6d8549-ef82-43f6-bc04-97565906e391","Type":"ContainerDied","Data":"55ec1bf876d6b75234d1facc4d30770dd47bd245958787b36f755fda13e3935d"} Oct 05 20:33:50 crc kubenswrapper[4753]: I1005 20:33:50.039781 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7759979f65-h9vzh" event={"ID":"4b6d8549-ef82-43f6-bc04-97565906e391","Type":"ContainerStarted","Data":"8380009d62996cd7a1a4e6f7afb4332688c117ef4e4000d82f091363fcd73c7a"} Oct 05 20:33:51 crc kubenswrapper[4753]: I1005 20:33:51.030538 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 05 20:33:51 crc kubenswrapper[4753]: I1005 20:33:51.048010 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7759979f65-h9vzh" event={"ID":"4b6d8549-ef82-43f6-bc04-97565906e391","Type":"ContainerStarted","Data":"241813e054282349133be92070c123e288e40424449341e84e28a6e89126f78f"} Oct 05 20:33:51 crc kubenswrapper[4753]: I1005 20:33:51.048244 4753 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:51 crc kubenswrapper[4753]: I1005 20:33:51.049104 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6ee31d4e-f57f-4b98-89f8-343679496901" containerName="nova-api-log" containerID="cri-o://db1589ed139a8f143b3b0098caa03a5ed5405c0811b32140129d3cd4cf32dfe3" gracePeriod=30 Oct 05 20:33:51 crc kubenswrapper[4753]: I1005 20:33:51.049131 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6ee31d4e-f57f-4b98-89f8-343679496901" containerName="nova-api-api" containerID="cri-o://67a86a61bfee73d54d1c41c85d384c50fbd39cef8867216a65775eae05b718d5" gracePeriod=30 Oct 05 20:33:51 crc kubenswrapper[4753]: I1005 20:33:51.073071 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7759979f65-h9vzh" podStartSLOduration=3.073043108 podStartE2EDuration="3.073043108s" podCreationTimestamp="2025-10-05 20:33:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:33:51.067850568 +0000 UTC m=+1139.916178800" watchObservedRunningTime="2025-10-05 20:33:51.073043108 +0000 UTC m=+1139.921371340" Oct 05 20:33:51 crc kubenswrapper[4753]: I1005 20:33:51.297132 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:51 crc kubenswrapper[4753]: I1005 20:33:51.298925 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="ceilometer-notification-agent" containerID="cri-o://d5f5c2d87a339558b8c7d072d0368a65cae97b026863695b0b745901a330aae2" gracePeriod=30 Oct 05 20:33:51 crc kubenswrapper[4753]: I1005 20:33:51.298975 4753 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="proxy-httpd" containerID="cri-o://82fb32666c6c51c7575059470f0d7b9771372741626a985ea5becf40d5ce747b" gracePeriod=30 Oct 05 20:33:51 crc kubenswrapper[4753]: I1005 20:33:51.299282 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="sg-core" containerID="cri-o://52563c869e0b7d0ff34706d82591063a3c61749c4a619ef0e1a87282cf4b0c13" gracePeriod=30 Oct 05 20:33:51 crc kubenswrapper[4753]: I1005 20:33:51.299434 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="ceilometer-central-agent" containerID="cri-o://795c0827726e33acf2c2701ce3fe485a5f0ecfe0baeeb402e25cd3b0905405d7" gracePeriod=30 Oct 05 20:33:51 crc kubenswrapper[4753]: I1005 20:33:51.308116 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.182:3000/\": EOF" Oct 05 20:33:51 crc kubenswrapper[4753]: I1005 20:33:51.681882 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:52 crc kubenswrapper[4753]: I1005 20:33:52.056627 4753 generic.go:334] "Generic (PLEG): container finished" podID="92a78918-103b-43ae-ab70-9b0180206f3d" containerID="82fb32666c6c51c7575059470f0d7b9771372741626a985ea5becf40d5ce747b" exitCode=0 Oct 05 20:33:52 crc kubenswrapper[4753]: I1005 20:33:52.056940 4753 generic.go:334] "Generic (PLEG): container finished" podID="92a78918-103b-43ae-ab70-9b0180206f3d" containerID="52563c869e0b7d0ff34706d82591063a3c61749c4a619ef0e1a87282cf4b0c13" exitCode=2 Oct 05 20:33:52 crc kubenswrapper[4753]: I1005 20:33:52.056948 4753 generic.go:334] "Generic (PLEG): container finished" 
podID="92a78918-103b-43ae-ab70-9b0180206f3d" containerID="795c0827726e33acf2c2701ce3fe485a5f0ecfe0baeeb402e25cd3b0905405d7" exitCode=0 Oct 05 20:33:52 crc kubenswrapper[4753]: I1005 20:33:52.056998 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a78918-103b-43ae-ab70-9b0180206f3d","Type":"ContainerDied","Data":"82fb32666c6c51c7575059470f0d7b9771372741626a985ea5becf40d5ce747b"} Oct 05 20:33:52 crc kubenswrapper[4753]: I1005 20:33:52.057021 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a78918-103b-43ae-ab70-9b0180206f3d","Type":"ContainerDied","Data":"52563c869e0b7d0ff34706d82591063a3c61749c4a619ef0e1a87282cf4b0c13"} Oct 05 20:33:52 crc kubenswrapper[4753]: I1005 20:33:52.057030 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a78918-103b-43ae-ab70-9b0180206f3d","Type":"ContainerDied","Data":"795c0827726e33acf2c2701ce3fe485a5f0ecfe0baeeb402e25cd3b0905405d7"} Oct 05 20:33:52 crc kubenswrapper[4753]: I1005 20:33:52.058708 4753 generic.go:334] "Generic (PLEG): container finished" podID="6ee31d4e-f57f-4b98-89f8-343679496901" containerID="db1589ed139a8f143b3b0098caa03a5ed5405c0811b32140129d3cd4cf32dfe3" exitCode=143 Oct 05 20:33:52 crc kubenswrapper[4753]: I1005 20:33:52.059393 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6ee31d4e-f57f-4b98-89f8-343679496901","Type":"ContainerDied","Data":"db1589ed139a8f143b3b0098caa03a5ed5405c0811b32140129d3cd4cf32dfe3"} Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.086588 4753 generic.go:334] "Generic (PLEG): container finished" podID="92a78918-103b-43ae-ab70-9b0180206f3d" containerID="d5f5c2d87a339558b8c7d072d0368a65cae97b026863695b0b745901a330aae2" exitCode=0 Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.086712 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"92a78918-103b-43ae-ab70-9b0180206f3d","Type":"ContainerDied","Data":"d5f5c2d87a339558b8c7d072d0368a65cae97b026863695b0b745901a330aae2"} Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.087112 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a78918-103b-43ae-ab70-9b0180206f3d","Type":"ContainerDied","Data":"ba6ad238c6ef360abc04e6d073de1cf8d5b2b611738e7860cc560ff5b1b7916f"} Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.087127 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba6ad238c6ef360abc04e6d073de1cf8d5b2b611738e7860cc560ff5b1b7916f" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.089086 4753 generic.go:334] "Generic (PLEG): container finished" podID="6ee31d4e-f57f-4b98-89f8-343679496901" containerID="67a86a61bfee73d54d1c41c85d384c50fbd39cef8867216a65775eae05b718d5" exitCode=0 Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.089118 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6ee31d4e-f57f-4b98-89f8-343679496901","Type":"ContainerDied","Data":"67a86a61bfee73d54d1c41c85d384c50fbd39cef8867216a65775eae05b718d5"} Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.094295 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.210477 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-config-data\") pod \"92a78918-103b-43ae-ab70-9b0180206f3d\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.210537 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-sg-core-conf-yaml\") pod \"92a78918-103b-43ae-ab70-9b0180206f3d\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.210574 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-ceilometer-tls-certs\") pod \"92a78918-103b-43ae-ab70-9b0180206f3d\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.210682 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-combined-ca-bundle\") pod \"92a78918-103b-43ae-ab70-9b0180206f3d\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.210709 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a78918-103b-43ae-ab70-9b0180206f3d-log-httpd\") pod \"92a78918-103b-43ae-ab70-9b0180206f3d\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.210735 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/92a78918-103b-43ae-ab70-9b0180206f3d-run-httpd\") pod \"92a78918-103b-43ae-ab70-9b0180206f3d\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.210777 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-scripts\") pod \"92a78918-103b-43ae-ab70-9b0180206f3d\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.210821 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92jpp\" (UniqueName: \"kubernetes.io/projected/92a78918-103b-43ae-ab70-9b0180206f3d-kube-api-access-92jpp\") pod \"92a78918-103b-43ae-ab70-9b0180206f3d\" (UID: \"92a78918-103b-43ae-ab70-9b0180206f3d\") " Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.211848 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a78918-103b-43ae-ab70-9b0180206f3d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "92a78918-103b-43ae-ab70-9b0180206f3d" (UID: "92a78918-103b-43ae-ab70-9b0180206f3d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.212229 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a78918-103b-43ae-ab70-9b0180206f3d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "92a78918-103b-43ae-ab70-9b0180206f3d" (UID: "92a78918-103b-43ae-ab70-9b0180206f3d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.219054 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-scripts" (OuterVolumeSpecName: "scripts") pod "92a78918-103b-43ae-ab70-9b0180206f3d" (UID: "92a78918-103b-43ae-ab70-9b0180206f3d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.228628 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a78918-103b-43ae-ab70-9b0180206f3d-kube-api-access-92jpp" (OuterVolumeSpecName: "kube-api-access-92jpp") pod "92a78918-103b-43ae-ab70-9b0180206f3d" (UID: "92a78918-103b-43ae-ab70-9b0180206f3d"). InnerVolumeSpecName "kube-api-access-92jpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.241461 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "92a78918-103b-43ae-ab70-9b0180206f3d" (UID: "92a78918-103b-43ae-ab70-9b0180206f3d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.264232 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "92a78918-103b-43ae-ab70-9b0180206f3d" (UID: "92a78918-103b-43ae-ab70-9b0180206f3d"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.307435 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92a78918-103b-43ae-ab70-9b0180206f3d" (UID: "92a78918-103b-43ae-ab70-9b0180206f3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.314731 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92jpp\" (UniqueName: \"kubernetes.io/projected/92a78918-103b-43ae-ab70-9b0180206f3d-kube-api-access-92jpp\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.314754 4753 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.314763 4753 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.314771 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.314781 4753 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a78918-103b-43ae-ab70-9b0180206f3d-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.314790 4753 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/92a78918-103b-43ae-ab70-9b0180206f3d-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.314799 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.316091 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-config-data" (OuterVolumeSpecName: "config-data") pod "92a78918-103b-43ae-ab70-9b0180206f3d" (UID: "92a78918-103b-43ae-ab70-9b0180206f3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.381332 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.416334 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a78918-103b-43ae-ab70-9b0180206f3d-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.517758 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ee31d4e-f57f-4b98-89f8-343679496901-config-data\") pod \"6ee31d4e-f57f-4b98-89f8-343679496901\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.517929 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ee31d4e-f57f-4b98-89f8-343679496901-combined-ca-bundle\") pod \"6ee31d4e-f57f-4b98-89f8-343679496901\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.517970 
4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ee31d4e-f57f-4b98-89f8-343679496901-logs\") pod \"6ee31d4e-f57f-4b98-89f8-343679496901\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.518003 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jtng\" (UniqueName: \"kubernetes.io/projected/6ee31d4e-f57f-4b98-89f8-343679496901-kube-api-access-9jtng\") pod \"6ee31d4e-f57f-4b98-89f8-343679496901\" (UID: \"6ee31d4e-f57f-4b98-89f8-343679496901\") " Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.518588 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee31d4e-f57f-4b98-89f8-343679496901-logs" (OuterVolumeSpecName: "logs") pod "6ee31d4e-f57f-4b98-89f8-343679496901" (UID: "6ee31d4e-f57f-4b98-89f8-343679496901"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.542042 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee31d4e-f57f-4b98-89f8-343679496901-kube-api-access-9jtng" (OuterVolumeSpecName: "kube-api-access-9jtng") pod "6ee31d4e-f57f-4b98-89f8-343679496901" (UID: "6ee31d4e-f57f-4b98-89f8-343679496901"). InnerVolumeSpecName "kube-api-access-9jtng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.558537 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee31d4e-f57f-4b98-89f8-343679496901-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ee31d4e-f57f-4b98-89f8-343679496901" (UID: "6ee31d4e-f57f-4b98-89f8-343679496901"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.565165 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee31d4e-f57f-4b98-89f8-343679496901-config-data" (OuterVolumeSpecName: "config-data") pod "6ee31d4e-f57f-4b98-89f8-343679496901" (UID: "6ee31d4e-f57f-4b98-89f8-343679496901"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.620540 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ee31d4e-f57f-4b98-89f8-343679496901-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.620580 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ee31d4e-f57f-4b98-89f8-343679496901-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.620597 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ee31d4e-f57f-4b98-89f8-343679496901-logs\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:55 crc kubenswrapper[4753]: I1005 20:33:55.620613 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jtng\" (UniqueName: \"kubernetes.io/projected/6ee31d4e-f57f-4b98-89f8-343679496901-kube-api-access-9jtng\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.098764 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6ee31d4e-f57f-4b98-89f8-343679496901","Type":"ContainerDied","Data":"d243ebfa12f509b36a4da0d5c05a33dd6f4cf3a5c68e859495a701c0b0c2abf1"} Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.098792 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.098823 4753 scope.go:117] "RemoveContainer" containerID="67a86a61bfee73d54d1c41c85d384c50fbd39cef8867216a65775eae05b718d5" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.098956 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.121820 4753 scope.go:117] "RemoveContainer" containerID="db1589ed139a8f143b3b0098caa03a5ed5405c0811b32140129d3cd4cf32dfe3" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.133551 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.141411 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.153833 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.166896 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.183249 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:56 crc kubenswrapper[4753]: E1005 20:33:56.183641 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee31d4e-f57f-4b98-89f8-343679496901" containerName="nova-api-log" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.183659 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee31d4e-f57f-4b98-89f8-343679496901" containerName="nova-api-log" Oct 05 20:33:56 crc kubenswrapper[4753]: E1005 20:33:56.183675 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee31d4e-f57f-4b98-89f8-343679496901" containerName="nova-api-api" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.183683 4753 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="6ee31d4e-f57f-4b98-89f8-343679496901" containerName="nova-api-api" Oct 05 20:33:56 crc kubenswrapper[4753]: E1005 20:33:56.183698 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="ceilometer-central-agent" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.183708 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="ceilometer-central-agent" Oct 05 20:33:56 crc kubenswrapper[4753]: E1005 20:33:56.183724 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="proxy-httpd" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.183729 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="proxy-httpd" Oct 05 20:33:56 crc kubenswrapper[4753]: E1005 20:33:56.183745 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="ceilometer-notification-agent" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.183751 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="ceilometer-notification-agent" Oct 05 20:33:56 crc kubenswrapper[4753]: E1005 20:33:56.183772 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="sg-core" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.183779 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="sg-core" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.183939 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="ceilometer-central-agent" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.183953 4753 
memory_manager.go:354] "RemoveStaleState removing state" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="ceilometer-notification-agent" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.183966 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ee31d4e-f57f-4b98-89f8-343679496901" containerName="nova-api-api" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.183974 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="sg-core" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.183987 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ee31d4e-f57f-4b98-89f8-343679496901" containerName="nova-api-log" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.183995 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" containerName="proxy-httpd" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.185742 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.191164 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.191693 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.191967 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.192675 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.206853 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.208259 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.211667 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.211822 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.211960 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.226564 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.335775 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-run-httpd\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " 
pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.336017 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx4fn\" (UniqueName: \"kubernetes.io/projected/009e7aaf-1c44-473c-97ef-9e0422e87607-kube-api-access-gx4fn\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.336193 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.336297 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-log-httpd\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.336371 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-internal-tls-certs\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.336459 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-public-tls-certs\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.336529 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-config-data\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.336584 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-config-data\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.336614 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.336692 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/009e7aaf-1c44-473c-97ef-9e0422e87607-logs\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.336733 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-scripts\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.336764 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m25hk\" (UniqueName: 
\"kubernetes.io/projected/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-kube-api-access-m25hk\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.336784 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.336801 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.438157 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.438215 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-log-httpd\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.438253 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-internal-tls-certs\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " 
pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.438290 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-public-tls-certs\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.438327 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-config-data\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.438359 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-config-data\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.438386 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.438417 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/009e7aaf-1c44-473c-97ef-9e0422e87607-logs\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.438446 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-scripts\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.438466 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m25hk\" (UniqueName: \"kubernetes.io/projected/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-kube-api-access-m25hk\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.438491 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.438511 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.438561 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx4fn\" (UniqueName: \"kubernetes.io/projected/009e7aaf-1c44-473c-97ef-9e0422e87607-kube-api-access-gx4fn\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.438582 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-run-httpd\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc 
kubenswrapper[4753]: I1005 20:33:56.438784 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-log-httpd\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.438940 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-run-httpd\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.442354 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.442377 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.442997 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/009e7aaf-1c44-473c-97ef-9e0422e87607-logs\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.444515 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.445715 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-config-data\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.446001 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-config-data\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.446096 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-scripts\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.448659 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.448671 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-public-tls-certs\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.451669 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-internal-tls-certs\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.457617 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m25hk\" (UniqueName: \"kubernetes.io/projected/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-kube-api-access-m25hk\") pod \"ceilometer-0\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.461413 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx4fn\" (UniqueName: \"kubernetes.io/projected/009e7aaf-1c44-473c-97ef-9e0422e87607-kube-api-access-gx4fn\") pod \"nova-api-0\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.499962 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.572496 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.683017 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:56 crc kubenswrapper[4753]: I1005 20:33:56.703162 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:57 crc kubenswrapper[4753]: W1005 20:33:57.010206 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde29aa2f_3662_4b9b_a32d_f6e0da626cb6.slice/crio-41a928f4e1c3e65ac5a846fb29d04a09145e0d3c7f72a3231d5a8e9985030b2e WatchSource:0}: Error finding container 41a928f4e1c3e65ac5a846fb29d04a09145e0d3c7f72a3231d5a8e9985030b2e: Status 404 returned error can't find the container with id 41a928f4e1c3e65ac5a846fb29d04a09145e0d3c7f72a3231d5a8e9985030b2e Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.012100 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.111535 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de29aa2f-3662-4b9b-a32d-f6e0da626cb6","Type":"ContainerStarted","Data":"41a928f4e1c3e65ac5a846fb29d04a09145e0d3c7f72a3231d5a8e9985030b2e"} Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.135954 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 05 20:33:57 crc kubenswrapper[4753]: W1005 20:33:57.138382 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod009e7aaf_1c44_473c_97ef_9e0422e87607.slice/crio-9aacac02c52ab01d9c79cddb83d8c32b8f0305cfe511aa2a570ae71475acc2b0 WatchSource:0}: Error finding container 9aacac02c52ab01d9c79cddb83d8c32b8f0305cfe511aa2a570ae71475acc2b0: Status 404 returned error can't find the 
container with id 9aacac02c52ab01d9c79cddb83d8c32b8f0305cfe511aa2a570ae71475acc2b0 Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.148556 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.329546 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-2cbwk"] Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.331334 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.336172 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.336580 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.341185 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2cbwk"] Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.457745 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj6km\" (UniqueName: \"kubernetes.io/projected/25d85233-42bd-4ee4-9d31-c0de142846b3-kube-api-access-dj6km\") pod \"nova-cell1-cell-mapping-2cbwk\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.457862 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-scripts\") pod \"nova-cell1-cell-mapping-2cbwk\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.457893 
4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2cbwk\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.457910 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-config-data\") pod \"nova-cell1-cell-mapping-2cbwk\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.560266 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-scripts\") pod \"nova-cell1-cell-mapping-2cbwk\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.560335 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2cbwk\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.560353 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-config-data\") pod \"nova-cell1-cell-mapping-2cbwk\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.560427 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dj6km\" (UniqueName: \"kubernetes.io/projected/25d85233-42bd-4ee4-9d31-c0de142846b3-kube-api-access-dj6km\") pod \"nova-cell1-cell-mapping-2cbwk\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.565311 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-scripts\") pod \"nova-cell1-cell-mapping-2cbwk\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.565868 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-config-data\") pod \"nova-cell1-cell-mapping-2cbwk\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.566920 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2cbwk\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.578010 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj6km\" (UniqueName: \"kubernetes.io/projected/25d85233-42bd-4ee4-9d31-c0de142846b3-kube-api-access-dj6km\") pod \"nova-cell1-cell-mapping-2cbwk\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.691827 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.877409 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee31d4e-f57f-4b98-89f8-343679496901" path="/var/lib/kubelet/pods/6ee31d4e-f57f-4b98-89f8-343679496901/volumes" Oct 05 20:33:57 crc kubenswrapper[4753]: I1005 20:33:57.878436 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a78918-103b-43ae-ab70-9b0180206f3d" path="/var/lib/kubelet/pods/92a78918-103b-43ae-ab70-9b0180206f3d/volumes" Oct 05 20:33:58 crc kubenswrapper[4753]: I1005 20:33:58.121945 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de29aa2f-3662-4b9b-a32d-f6e0da626cb6","Type":"ContainerStarted","Data":"422ccff053fe6a7167f44481ef576ba9d402cc60a51f0c18fa2b2888805db045"} Oct 05 20:33:58 crc kubenswrapper[4753]: I1005 20:33:58.125075 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"009e7aaf-1c44-473c-97ef-9e0422e87607","Type":"ContainerStarted","Data":"ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d"} Oct 05 20:33:58 crc kubenswrapper[4753]: I1005 20:33:58.125106 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"009e7aaf-1c44-473c-97ef-9e0422e87607","Type":"ContainerStarted","Data":"5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e"} Oct 05 20:33:58 crc kubenswrapper[4753]: I1005 20:33:58.125168 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"009e7aaf-1c44-473c-97ef-9e0422e87607","Type":"ContainerStarted","Data":"9aacac02c52ab01d9c79cddb83d8c32b8f0305cfe511aa2a570ae71475acc2b0"} Oct 05 20:33:58 crc kubenswrapper[4753]: I1005 20:33:58.148830 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.148806592 podStartE2EDuration="2.148806592s" 
podCreationTimestamp="2025-10-05 20:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:33:58.142344011 +0000 UTC m=+1146.990672243" watchObservedRunningTime="2025-10-05 20:33:58.148806592 +0000 UTC m=+1146.997134824" Oct 05 20:33:58 crc kubenswrapper[4753]: I1005 20:33:58.214520 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2cbwk"] Oct 05 20:33:58 crc kubenswrapper[4753]: I1005 20:33:58.599278 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:33:58 crc kubenswrapper[4753]: I1005 20:33:58.656050 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b86468d5c-ghpzp"] Oct 05 20:33:58 crc kubenswrapper[4753]: I1005 20:33:58.656276 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" podUID="9bcf51ef-4faa-4209-bbf8-b17d7177d0b5" containerName="dnsmasq-dns" containerID="cri-o://61cf56781e7162cf823930e3e3d0b49ad95f64dcee4252339d9cabb15b161200" gracePeriod=10 Oct 05 20:33:58 crc kubenswrapper[4753]: I1005 20:33:58.906072 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" podUID="9bcf51ef-4faa-4209-bbf8-b17d7177d0b5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.174:5353: connect: connection refused" Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.140534 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de29aa2f-3662-4b9b-a32d-f6e0da626cb6","Type":"ContainerStarted","Data":"a7ec303068f89f1d49f0a5667920911c10d5c5c7d4419d9c47146e9fb2952ca6"} Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.140589 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"de29aa2f-3662-4b9b-a32d-f6e0da626cb6","Type":"ContainerStarted","Data":"7aaa8ebf2ca481f8bcd382dcb6b0fcc15d54dec9d2a359984262ae9283c3bf9d"} Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.148236 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2cbwk" event={"ID":"25d85233-42bd-4ee4-9d31-c0de142846b3","Type":"ContainerStarted","Data":"12598f685c16233aa70f1178fe26d38534b0986da8a8be2ec58412701847578b"} Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.148275 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2cbwk" event={"ID":"25d85233-42bd-4ee4-9d31-c0de142846b3","Type":"ContainerStarted","Data":"ec37e8c921017324ab88112040235ea030f2f296b65c19ba7cc4d850bce77492"} Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.169218 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-2cbwk" podStartSLOduration=2.169200697 podStartE2EDuration="2.169200697s" podCreationTimestamp="2025-10-05 20:33:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:33:59.16250213 +0000 UTC m=+1148.010830362" watchObservedRunningTime="2025-10-05 20:33:59.169200697 +0000 UTC m=+1148.017528929" Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.172616 4753 generic.go:334] "Generic (PLEG): container finished" podID="9bcf51ef-4faa-4209-bbf8-b17d7177d0b5" containerID="61cf56781e7162cf823930e3e3d0b49ad95f64dcee4252339d9cabb15b161200" exitCode=0 Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.172705 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" event={"ID":"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5","Type":"ContainerDied","Data":"61cf56781e7162cf823930e3e3d0b49ad95f64dcee4252339d9cabb15b161200"} Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.448978 4753 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.609063 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-ovsdbserver-nb\") pod \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.609114 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-config\") pod \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.609208 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-ovsdbserver-sb\") pod \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.609223 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-dns-svc\") pod \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.609254 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45qmh\" (UniqueName: \"kubernetes.io/projected/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-kube-api-access-45qmh\") pod \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\" (UID: \"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5\") " Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.615278 4753 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-kube-api-access-45qmh" (OuterVolumeSpecName: "kube-api-access-45qmh") pod "9bcf51ef-4faa-4209-bbf8-b17d7177d0b5" (UID: "9bcf51ef-4faa-4209-bbf8-b17d7177d0b5"). InnerVolumeSpecName "kube-api-access-45qmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.669755 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9bcf51ef-4faa-4209-bbf8-b17d7177d0b5" (UID: "9bcf51ef-4faa-4209-bbf8-b17d7177d0b5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.682729 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-config" (OuterVolumeSpecName: "config") pod "9bcf51ef-4faa-4209-bbf8-b17d7177d0b5" (UID: "9bcf51ef-4faa-4209-bbf8-b17d7177d0b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.683591 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9bcf51ef-4faa-4209-bbf8-b17d7177d0b5" (UID: "9bcf51ef-4faa-4209-bbf8-b17d7177d0b5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.710543 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9bcf51ef-4faa-4209-bbf8-b17d7177d0b5" (UID: "9bcf51ef-4faa-4209-bbf8-b17d7177d0b5"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.711633 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.711683 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.711692 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.711701 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 05 20:33:59 crc kubenswrapper[4753]: I1005 20:33:59.711709 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45qmh\" (UniqueName: \"kubernetes.io/projected/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5-kube-api-access-45qmh\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:00 crc kubenswrapper[4753]: I1005 20:34:00.199486 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" Oct 05 20:34:00 crc kubenswrapper[4753]: I1005 20:34:00.201103 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b86468d5c-ghpzp" event={"ID":"9bcf51ef-4faa-4209-bbf8-b17d7177d0b5","Type":"ContainerDied","Data":"1ba09cfc3f029affaee102d88168273ad1ea1fa79163b1a6fe243d1258d745f5"} Oct 05 20:34:00 crc kubenswrapper[4753]: I1005 20:34:00.201164 4753 scope.go:117] "RemoveContainer" containerID="61cf56781e7162cf823930e3e3d0b49ad95f64dcee4252339d9cabb15b161200" Oct 05 20:34:00 crc kubenswrapper[4753]: I1005 20:34:00.226532 4753 scope.go:117] "RemoveContainer" containerID="b60e4d3dbb25d2cdb7f2c61ebedbaf573e141c1737917000b62dcac0181a8aaa" Oct 05 20:34:00 crc kubenswrapper[4753]: I1005 20:34:00.233869 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b86468d5c-ghpzp"] Oct 05 20:34:00 crc kubenswrapper[4753]: I1005 20:34:00.241418 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b86468d5c-ghpzp"] Oct 05 20:34:01 crc kubenswrapper[4753]: I1005 20:34:01.211771 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de29aa2f-3662-4b9b-a32d-f6e0da626cb6","Type":"ContainerStarted","Data":"8e2d9dada88a876eb4e5d5885fbd9a554538e38f4c502ecba524608791bb8253"} Oct 05 20:34:01 crc kubenswrapper[4753]: I1005 20:34:01.212076 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 05 20:34:01 crc kubenswrapper[4753]: I1005 20:34:01.242209 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.945411175 podStartE2EDuration="5.242189368s" podCreationTimestamp="2025-10-05 20:33:56 +0000 UTC" firstStartedPulling="2025-10-05 20:33:57.013119241 +0000 UTC m=+1145.861447473" lastFinishedPulling="2025-10-05 20:34:00.309897434 +0000 UTC m=+1149.158225666" 
observedRunningTime="2025-10-05 20:34:01.238356638 +0000 UTC m=+1150.086684890" watchObservedRunningTime="2025-10-05 20:34:01.242189368 +0000 UTC m=+1150.090517600" Oct 05 20:34:01 crc kubenswrapper[4753]: I1005 20:34:01.888098 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bcf51ef-4faa-4209-bbf8-b17d7177d0b5" path="/var/lib/kubelet/pods/9bcf51ef-4faa-4209-bbf8-b17d7177d0b5/volumes" Oct 05 20:34:04 crc kubenswrapper[4753]: I1005 20:34:04.238308 4753 generic.go:334] "Generic (PLEG): container finished" podID="25d85233-42bd-4ee4-9d31-c0de142846b3" containerID="12598f685c16233aa70f1178fe26d38534b0986da8a8be2ec58412701847578b" exitCode=0 Oct 05 20:34:04 crc kubenswrapper[4753]: I1005 20:34:04.238381 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2cbwk" event={"ID":"25d85233-42bd-4ee4-9d31-c0de142846b3","Type":"ContainerDied","Data":"12598f685c16233aa70f1178fe26d38534b0986da8a8be2ec58412701847578b"} Oct 05 20:34:04 crc kubenswrapper[4753]: I1005 20:34:04.489809 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:34:04 crc kubenswrapper[4753]: I1005 20:34:04.489870 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:34:05 crc kubenswrapper[4753]: I1005 20:34:05.619683 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:34:05 crc kubenswrapper[4753]: I1005 20:34:05.740248 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj6km\" (UniqueName: \"kubernetes.io/projected/25d85233-42bd-4ee4-9d31-c0de142846b3-kube-api-access-dj6km\") pod \"25d85233-42bd-4ee4-9d31-c0de142846b3\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " Oct 05 20:34:05 crc kubenswrapper[4753]: I1005 20:34:05.740350 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-scripts\") pod \"25d85233-42bd-4ee4-9d31-c0de142846b3\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " Oct 05 20:34:05 crc kubenswrapper[4753]: I1005 20:34:05.740513 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-combined-ca-bundle\") pod \"25d85233-42bd-4ee4-9d31-c0de142846b3\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " Oct 05 20:34:05 crc kubenswrapper[4753]: I1005 20:34:05.740644 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-config-data\") pod \"25d85233-42bd-4ee4-9d31-c0de142846b3\" (UID: \"25d85233-42bd-4ee4-9d31-c0de142846b3\") " Oct 05 20:34:05 crc kubenswrapper[4753]: I1005 20:34:05.747549 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-scripts" (OuterVolumeSpecName: "scripts") pod "25d85233-42bd-4ee4-9d31-c0de142846b3" (UID: "25d85233-42bd-4ee4-9d31-c0de142846b3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:34:05 crc kubenswrapper[4753]: I1005 20:34:05.747612 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d85233-42bd-4ee4-9d31-c0de142846b3-kube-api-access-dj6km" (OuterVolumeSpecName: "kube-api-access-dj6km") pod "25d85233-42bd-4ee4-9d31-c0de142846b3" (UID: "25d85233-42bd-4ee4-9d31-c0de142846b3"). InnerVolumeSpecName "kube-api-access-dj6km". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:34:05 crc kubenswrapper[4753]: I1005 20:34:05.772133 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-config-data" (OuterVolumeSpecName: "config-data") pod "25d85233-42bd-4ee4-9d31-c0de142846b3" (UID: "25d85233-42bd-4ee4-9d31-c0de142846b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:34:05 crc kubenswrapper[4753]: I1005 20:34:05.778920 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25d85233-42bd-4ee4-9d31-c0de142846b3" (UID: "25d85233-42bd-4ee4-9d31-c0de142846b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:34:05 crc kubenswrapper[4753]: I1005 20:34:05.876061 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:05 crc kubenswrapper[4753]: I1005 20:34:05.876101 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:05 crc kubenswrapper[4753]: I1005 20:34:05.876115 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj6km\" (UniqueName: \"kubernetes.io/projected/25d85233-42bd-4ee4-9d31-c0de142846b3-kube-api-access-dj6km\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:05 crc kubenswrapper[4753]: I1005 20:34:05.876128 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d85233-42bd-4ee4-9d31-c0de142846b3-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:06 crc kubenswrapper[4753]: I1005 20:34:06.258296 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2cbwk" event={"ID":"25d85233-42bd-4ee4-9d31-c0de142846b3","Type":"ContainerDied","Data":"ec37e8c921017324ab88112040235ea030f2f296b65c19ba7cc4d850bce77492"} Oct 05 20:34:06 crc kubenswrapper[4753]: I1005 20:34:06.258333 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec37e8c921017324ab88112040235ea030f2f296b65c19ba7cc4d850bce77492" Oct 05 20:34:06 crc kubenswrapper[4753]: I1005 20:34:06.258383 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2cbwk" Oct 05 20:34:06 crc kubenswrapper[4753]: I1005 20:34:06.444700 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 05 20:34:06 crc kubenswrapper[4753]: I1005 20:34:06.445383 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="009e7aaf-1c44-473c-97ef-9e0422e87607" containerName="nova-api-log" containerID="cri-o://5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e" gracePeriod=30 Oct 05 20:34:06 crc kubenswrapper[4753]: I1005 20:34:06.445484 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="009e7aaf-1c44-473c-97ef-9e0422e87607" containerName="nova-api-api" containerID="cri-o://ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d" gracePeriod=30 Oct 05 20:34:06 crc kubenswrapper[4753]: I1005 20:34:06.470457 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 05 20:34:06 crc kubenswrapper[4753]: I1005 20:34:06.470698 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="87d65699-7e60-444b-aefb-d5d80bf24404" containerName="nova-scheduler-scheduler" containerID="cri-o://3d2bc67b44b4e12fa9d832b6c88ff29a0d016ca2468f8307fbf8895b770a6ec5" gracePeriod=30 Oct 05 20:34:06 crc kubenswrapper[4753]: I1005 20:34:06.495109 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:34:06 crc kubenswrapper[4753]: I1005 20:34:06.496292 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" containerName="nova-metadata-log" containerID="cri-o://ec4418cbdb10776cb511fea8368349ae7e960bec17a95b7371c061ab2a4d4eee" gracePeriod=30 Oct 05 20:34:06 crc kubenswrapper[4753]: I1005 20:34:06.496459 4753 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" containerName="nova-metadata-metadata" containerID="cri-o://0b32bba98d152b9756a5785b8f17a9dd6b7f69f421f0f42cdf885600ed3b23b7" gracePeriod=30 Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.004632 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.200120 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-public-tls-certs\") pod \"009e7aaf-1c44-473c-97ef-9e0422e87607\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.200249 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx4fn\" (UniqueName: \"kubernetes.io/projected/009e7aaf-1c44-473c-97ef-9e0422e87607-kube-api-access-gx4fn\") pod \"009e7aaf-1c44-473c-97ef-9e0422e87607\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.200295 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/009e7aaf-1c44-473c-97ef-9e0422e87607-logs\") pod \"009e7aaf-1c44-473c-97ef-9e0422e87607\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.200375 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-internal-tls-certs\") pod \"009e7aaf-1c44-473c-97ef-9e0422e87607\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.200899 4753 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/009e7aaf-1c44-473c-97ef-9e0422e87607-logs" (OuterVolumeSpecName: "logs") pod "009e7aaf-1c44-473c-97ef-9e0422e87607" (UID: "009e7aaf-1c44-473c-97ef-9e0422e87607"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.201020 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-combined-ca-bundle\") pod \"009e7aaf-1c44-473c-97ef-9e0422e87607\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.201380 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-config-data\") pod \"009e7aaf-1c44-473c-97ef-9e0422e87607\" (UID: \"009e7aaf-1c44-473c-97ef-9e0422e87607\") " Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.201703 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/009e7aaf-1c44-473c-97ef-9e0422e87607-logs\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.204962 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/009e7aaf-1c44-473c-97ef-9e0422e87607-kube-api-access-gx4fn" (OuterVolumeSpecName: "kube-api-access-gx4fn") pod "009e7aaf-1c44-473c-97ef-9e0422e87607" (UID: "009e7aaf-1c44-473c-97ef-9e0422e87607"). InnerVolumeSpecName "kube-api-access-gx4fn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.237231 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-config-data" (OuterVolumeSpecName: "config-data") pod "009e7aaf-1c44-473c-97ef-9e0422e87607" (UID: "009e7aaf-1c44-473c-97ef-9e0422e87607"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.237495 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "009e7aaf-1c44-473c-97ef-9e0422e87607" (UID: "009e7aaf-1c44-473c-97ef-9e0422e87607"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.253368 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "009e7aaf-1c44-473c-97ef-9e0422e87607" (UID: "009e7aaf-1c44-473c-97ef-9e0422e87607"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.255975 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "009e7aaf-1c44-473c-97ef-9e0422e87607" (UID: "009e7aaf-1c44-473c-97ef-9e0422e87607"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.270018 4753 generic.go:334] "Generic (PLEG): container finished" podID="d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" containerID="ec4418cbdb10776cb511fea8368349ae7e960bec17a95b7371c061ab2a4d4eee" exitCode=143 Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.270089 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0","Type":"ContainerDied","Data":"ec4418cbdb10776cb511fea8368349ae7e960bec17a95b7371c061ab2a4d4eee"} Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.271560 4753 generic.go:334] "Generic (PLEG): container finished" podID="009e7aaf-1c44-473c-97ef-9e0422e87607" containerID="ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d" exitCode=0 Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.271583 4753 generic.go:334] "Generic (PLEG): container finished" podID="009e7aaf-1c44-473c-97ef-9e0422e87607" containerID="5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e" exitCode=143 Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.271603 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"009e7aaf-1c44-473c-97ef-9e0422e87607","Type":"ContainerDied","Data":"ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d"} Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.271623 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"009e7aaf-1c44-473c-97ef-9e0422e87607","Type":"ContainerDied","Data":"5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e"} Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.271635 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"009e7aaf-1c44-473c-97ef-9e0422e87607","Type":"ContainerDied","Data":"9aacac02c52ab01d9c79cddb83d8c32b8f0305cfe511aa2a570ae71475acc2b0"} Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.271655 4753 scope.go:117] "RemoveContainer" containerID="ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.271790 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.309622 4753 scope.go:117] "RemoveContainer" containerID="5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.310329 4753 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.310361 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx4fn\" (UniqueName: \"kubernetes.io/projected/009e7aaf-1c44-473c-97ef-9e0422e87607-kube-api-access-gx4fn\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.310372 4753 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.310382 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.310391 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/009e7aaf-1c44-473c-97ef-9e0422e87607-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.331843 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.348087 4753 scope.go:117] "RemoveContainer" containerID="ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d" Oct 05 20:34:07 crc kubenswrapper[4753]: E1005 20:34:07.348888 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d\": container with ID starting with ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d not found: ID does not exist" containerID="ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.348925 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d"} err="failed to get container status \"ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d\": rpc error: code = NotFound desc = could not find container \"ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d\": container with ID starting with ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d not found: ID does not exist" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.348952 4753 scope.go:117] "RemoveContainer" containerID="5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e" Oct 05 20:34:07 crc kubenswrapper[4753]: E1005 20:34:07.349281 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e\": container with ID starting with 
5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e not found: ID does not exist" containerID="5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.349312 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e"} err="failed to get container status \"5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e\": rpc error: code = NotFound desc = could not find container \"5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e\": container with ID starting with 5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e not found: ID does not exist" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.349330 4753 scope.go:117] "RemoveContainer" containerID="ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.349407 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.349561 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d"} err="failed to get container status \"ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d\": rpc error: code = NotFound desc = could not find container \"ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d\": container with ID starting with ae19087b53a032e5876b4e26f05c8e978b6d71316b17a14ea5fec32d8552489d not found: ID does not exist" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.349590 4753 scope.go:117] "RemoveContainer" containerID="5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.349916 4753 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e"} err="failed to get container status \"5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e\": rpc error: code = NotFound desc = could not find container \"5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e\": container with ID starting with 5b84f9ef830f8f5d03a41821461f20c7eb61eb9e4b46d518d7274c59c3d39e4e not found: ID does not exist" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.374455 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 05 20:34:07 crc kubenswrapper[4753]: E1005 20:34:07.374881 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009e7aaf-1c44-473c-97ef-9e0422e87607" containerName="nova-api-log" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.374898 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="009e7aaf-1c44-473c-97ef-9e0422e87607" containerName="nova-api-log" Oct 05 20:34:07 crc kubenswrapper[4753]: E1005 20:34:07.374918 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009e7aaf-1c44-473c-97ef-9e0422e87607" containerName="nova-api-api" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.374925 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="009e7aaf-1c44-473c-97ef-9e0422e87607" containerName="nova-api-api" Oct 05 20:34:07 crc kubenswrapper[4753]: E1005 20:34:07.374932 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bcf51ef-4faa-4209-bbf8-b17d7177d0b5" containerName="dnsmasq-dns" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.374938 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bcf51ef-4faa-4209-bbf8-b17d7177d0b5" containerName="dnsmasq-dns" Oct 05 20:34:07 crc kubenswrapper[4753]: E1005 20:34:07.374947 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d85233-42bd-4ee4-9d31-c0de142846b3" containerName="nova-manage" Oct 05 20:34:07 crc 
kubenswrapper[4753]: I1005 20:34:07.374954 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d85233-42bd-4ee4-9d31-c0de142846b3" containerName="nova-manage" Oct 05 20:34:07 crc kubenswrapper[4753]: E1005 20:34:07.374967 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bcf51ef-4faa-4209-bbf8-b17d7177d0b5" containerName="init" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.374973 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bcf51ef-4faa-4209-bbf8-b17d7177d0b5" containerName="init" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.375171 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bcf51ef-4faa-4209-bbf8-b17d7177d0b5" containerName="dnsmasq-dns" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.375183 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d85233-42bd-4ee4-9d31-c0de142846b3" containerName="nova-manage" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.375200 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="009e7aaf-1c44-473c-97ef-9e0422e87607" containerName="nova-api-api" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.375209 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="009e7aaf-1c44-473c-97ef-9e0422e87607" containerName="nova-api-log" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.376233 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.379082 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.379318 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.379451 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.386202 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.410990 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd2971d2-d61f-4268-9366-6e11ae7f71bc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.411125 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvsn8\" (UniqueName: \"kubernetes.io/projected/fd2971d2-d61f-4268-9366-6e11ae7f71bc-kube-api-access-qvsn8\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.411183 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd2971d2-d61f-4268-9366-6e11ae7f71bc-public-tls-certs\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.411292 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd2971d2-d61f-4268-9366-6e11ae7f71bc-logs\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.411338 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd2971d2-d61f-4268-9366-6e11ae7f71bc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.411414 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd2971d2-d61f-4268-9366-6e11ae7f71bc-config-data\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.513586 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd2971d2-d61f-4268-9366-6e11ae7f71bc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.513653 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvsn8\" (UniqueName: \"kubernetes.io/projected/fd2971d2-d61f-4268-9366-6e11ae7f71bc-kube-api-access-qvsn8\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.513679 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd2971d2-d61f-4268-9366-6e11ae7f71bc-public-tls-certs\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " 
pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.513729 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd2971d2-d61f-4268-9366-6e11ae7f71bc-logs\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.513767 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd2971d2-d61f-4268-9366-6e11ae7f71bc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.513817 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd2971d2-d61f-4268-9366-6e11ae7f71bc-config-data\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.514394 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd2971d2-d61f-4268-9366-6e11ae7f71bc-logs\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.516701 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd2971d2-d61f-4268-9366-6e11ae7f71bc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.516946 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd2971d2-d61f-4268-9366-6e11ae7f71bc-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.517354 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd2971d2-d61f-4268-9366-6e11ae7f71bc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.518074 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd2971d2-d61f-4268-9366-6e11ae7f71bc-config-data\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.530063 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvsn8\" (UniqueName: \"kubernetes.io/projected/fd2971d2-d61f-4268-9366-6e11ae7f71bc-kube-api-access-qvsn8\") pod \"nova-api-0\" (UID: \"fd2971d2-d61f-4268-9366-6e11ae7f71bc\") " pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.698985 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 05 20:34:07 crc kubenswrapper[4753]: I1005 20:34:07.874842 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="009e7aaf-1c44-473c-97ef-9e0422e87607" path="/var/lib/kubelet/pods/009e7aaf-1c44-473c-97ef-9e0422e87607/volumes" Oct 05 20:34:08 crc kubenswrapper[4753]: W1005 20:34:08.190808 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd2971d2_d61f_4268_9366_6e11ae7f71bc.slice/crio-37a5a7f9e53020592eb134c3cd65a941fe87b2e32354a538e9281c186a06c691 WatchSource:0}: Error finding container 37a5a7f9e53020592eb134c3cd65a941fe87b2e32354a538e9281c186a06c691: Status 404 returned error can't find the container with id 37a5a7f9e53020592eb134c3cd65a941fe87b2e32354a538e9281c186a06c691 Oct 05 20:34:08 crc kubenswrapper[4753]: I1005 20:34:08.192086 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 05 20:34:08 crc kubenswrapper[4753]: I1005 20:34:08.282215 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fd2971d2-d61f-4268-9366-6e11ae7f71bc","Type":"ContainerStarted","Data":"37a5a7f9e53020592eb134c3cd65a941fe87b2e32354a538e9281c186a06c691"} Oct 05 20:34:08 crc kubenswrapper[4753]: I1005 20:34:08.306267 4753 generic.go:334] "Generic (PLEG): container finished" podID="87d65699-7e60-444b-aefb-d5d80bf24404" containerID="3d2bc67b44b4e12fa9d832b6c88ff29a0d016ca2468f8307fbf8895b770a6ec5" exitCode=0 Oct 05 20:34:08 crc kubenswrapper[4753]: I1005 20:34:08.306337 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87d65699-7e60-444b-aefb-d5d80bf24404","Type":"ContainerDied","Data":"3d2bc67b44b4e12fa9d832b6c88ff29a0d016ca2468f8307fbf8895b770a6ec5"} Oct 05 20:34:08 crc kubenswrapper[4753]: I1005 20:34:08.407046 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 05 20:34:08 crc kubenswrapper[4753]: I1005 20:34:08.436011 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87d65699-7e60-444b-aefb-d5d80bf24404-config-data\") pod \"87d65699-7e60-444b-aefb-d5d80bf24404\" (UID: \"87d65699-7e60-444b-aefb-d5d80bf24404\") " Oct 05 20:34:08 crc kubenswrapper[4753]: I1005 20:34:08.436094 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87d65699-7e60-444b-aefb-d5d80bf24404-combined-ca-bundle\") pod \"87d65699-7e60-444b-aefb-d5d80bf24404\" (UID: \"87d65699-7e60-444b-aefb-d5d80bf24404\") " Oct 05 20:34:08 crc kubenswrapper[4753]: I1005 20:34:08.436216 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62zcb\" (UniqueName: \"kubernetes.io/projected/87d65699-7e60-444b-aefb-d5d80bf24404-kube-api-access-62zcb\") pod \"87d65699-7e60-444b-aefb-d5d80bf24404\" (UID: \"87d65699-7e60-444b-aefb-d5d80bf24404\") " Oct 05 20:34:08 crc kubenswrapper[4753]: I1005 20:34:08.449512 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87d65699-7e60-444b-aefb-d5d80bf24404-kube-api-access-62zcb" (OuterVolumeSpecName: "kube-api-access-62zcb") pod "87d65699-7e60-444b-aefb-d5d80bf24404" (UID: "87d65699-7e60-444b-aefb-d5d80bf24404"). InnerVolumeSpecName "kube-api-access-62zcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:34:08 crc kubenswrapper[4753]: I1005 20:34:08.481676 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87d65699-7e60-444b-aefb-d5d80bf24404-config-data" (OuterVolumeSpecName: "config-data") pod "87d65699-7e60-444b-aefb-d5d80bf24404" (UID: "87d65699-7e60-444b-aefb-d5d80bf24404"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:34:08 crc kubenswrapper[4753]: I1005 20:34:08.484890 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87d65699-7e60-444b-aefb-d5d80bf24404-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87d65699-7e60-444b-aefb-d5d80bf24404" (UID: "87d65699-7e60-444b-aefb-d5d80bf24404"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:34:08 crc kubenswrapper[4753]: I1005 20:34:08.537392 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87d65699-7e60-444b-aefb-d5d80bf24404-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:08 crc kubenswrapper[4753]: I1005 20:34:08.537424 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87d65699-7e60-444b-aefb-d5d80bf24404-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:08 crc kubenswrapper[4753]: I1005 20:34:08.537438 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62zcb\" (UniqueName: \"kubernetes.io/projected/87d65699-7e60-444b-aefb-d5d80bf24404-kube-api-access-62zcb\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.315894 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fd2971d2-d61f-4268-9366-6e11ae7f71bc","Type":"ContainerStarted","Data":"2f83fcd5711ef3729e538be000c6dfe0a0f713ed75ba3bd43b911f2b743fb68d"} Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.316344 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fd2971d2-d61f-4268-9366-6e11ae7f71bc","Type":"ContainerStarted","Data":"9e8d3f6c24ab449efc6f61d60648fa356bdd7bd272e744045865b68b8a16caea"} Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.318411 4753 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87d65699-7e60-444b-aefb-d5d80bf24404","Type":"ContainerDied","Data":"42fdcedd0e1d16be753237d8a832586f56601edd3e89b647a08b1334d907b401"} Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.318470 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.318466 4753 scope.go:117] "RemoveContainer" containerID="3d2bc67b44b4e12fa9d832b6c88ff29a0d016ca2468f8307fbf8895b770a6ec5" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.344316 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.344293391 podStartE2EDuration="2.344293391s" podCreationTimestamp="2025-10-05 20:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:34:09.335070735 +0000 UTC m=+1158.183398997" watchObservedRunningTime="2025-10-05 20:34:09.344293391 +0000 UTC m=+1158.192621623" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.362718 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.369306 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.388345 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 05 20:34:09 crc kubenswrapper[4753]: E1005 20:34:09.389008 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87d65699-7e60-444b-aefb-d5d80bf24404" containerName="nova-scheduler-scheduler" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.389034 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="87d65699-7e60-444b-aefb-d5d80bf24404" containerName="nova-scheduler-scheduler" Oct 05 
20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.389236 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="87d65699-7e60-444b-aefb-d5d80bf24404" containerName="nova-scheduler-scheduler" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.391192 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.393177 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.399830 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.451516 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch9b4\" (UniqueName: \"kubernetes.io/projected/346a135f-f1af-4968-9c9f-4540f2a71161-kube-api-access-ch9b4\") pod \"nova-scheduler-0\" (UID: \"346a135f-f1af-4968-9c9f-4540f2a71161\") " pod="openstack/nova-scheduler-0" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.451559 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346a135f-f1af-4968-9c9f-4540f2a71161-config-data\") pod \"nova-scheduler-0\" (UID: \"346a135f-f1af-4968-9c9f-4540f2a71161\") " pod="openstack/nova-scheduler-0" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.451627 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346a135f-f1af-4968-9c9f-4540f2a71161-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"346a135f-f1af-4968-9c9f-4540f2a71161\") " pod="openstack/nova-scheduler-0" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.553019 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346a135f-f1af-4968-9c9f-4540f2a71161-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"346a135f-f1af-4968-9c9f-4540f2a71161\") " pod="openstack/nova-scheduler-0" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.553118 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch9b4\" (UniqueName: \"kubernetes.io/projected/346a135f-f1af-4968-9c9f-4540f2a71161-kube-api-access-ch9b4\") pod \"nova-scheduler-0\" (UID: \"346a135f-f1af-4968-9c9f-4540f2a71161\") " pod="openstack/nova-scheduler-0" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.553161 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346a135f-f1af-4968-9c9f-4540f2a71161-config-data\") pod \"nova-scheduler-0\" (UID: \"346a135f-f1af-4968-9c9f-4540f2a71161\") " pod="openstack/nova-scheduler-0" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.557574 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346a135f-f1af-4968-9c9f-4540f2a71161-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"346a135f-f1af-4968-9c9f-4540f2a71161\") " pod="openstack/nova-scheduler-0" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.558960 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346a135f-f1af-4968-9c9f-4540f2a71161-config-data\") pod \"nova-scheduler-0\" (UID: \"346a135f-f1af-4968-9c9f-4540f2a71161\") " pod="openstack/nova-scheduler-0" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.568062 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch9b4\" (UniqueName: \"kubernetes.io/projected/346a135f-f1af-4968-9c9f-4540f2a71161-kube-api-access-ch9b4\") pod \"nova-scheduler-0\" (UID: 
\"346a135f-f1af-4968-9c9f-4540f2a71161\") " pod="openstack/nova-scheduler-0" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.639928 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": read tcp 10.217.0.2:51226->10.217.0.178:8775: read: connection reset by peer" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.639979 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": read tcp 10.217.0.2:51214->10.217.0.178:8775: read: connection reset by peer" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.708175 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 05 20:34:09 crc kubenswrapper[4753]: I1005 20:34:09.867399 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87d65699-7e60-444b-aefb-d5d80bf24404" path="/var/lib/kubelet/pods/87d65699-7e60-444b-aefb-d5d80bf24404/volumes" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.030318 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.164444 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfrwf\" (UniqueName: \"kubernetes.io/projected/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-kube-api-access-lfrwf\") pod \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.164514 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-logs\") pod \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.164568 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-config-data\") pod \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.164589 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-nova-metadata-tls-certs\") pod \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.164634 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-combined-ca-bundle\") pod \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\" (UID: \"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0\") " Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.165768 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-logs" (OuterVolumeSpecName: "logs") pod "d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" (UID: "d8afbf63-cd8b-483a-a533-ccddc2c3ebc0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.166075 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-logs\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.169922 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-kube-api-access-lfrwf" (OuterVolumeSpecName: "kube-api-access-lfrwf") pod "d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" (UID: "d8afbf63-cd8b-483a-a533-ccddc2c3ebc0"). InnerVolumeSpecName "kube-api-access-lfrwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.191129 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" (UID: "d8afbf63-cd8b-483a-a533-ccddc2c3ebc0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.193363 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-config-data" (OuterVolumeSpecName: "config-data") pod "d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" (UID: "d8afbf63-cd8b-483a-a533-ccddc2c3ebc0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:34:10 crc kubenswrapper[4753]: W1005 20:34:10.216520 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod346a135f_f1af_4968_9c9f_4540f2a71161.slice/crio-392235de3fdd507b8f3ae45d3b5026638b1852540f2b5ac0d4625ff932afb243 WatchSource:0}: Error finding container 392235de3fdd507b8f3ae45d3b5026638b1852540f2b5ac0d4625ff932afb243: Status 404 returned error can't find the container with id 392235de3fdd507b8f3ae45d3b5026638b1852540f2b5ac0d4625ff932afb243 Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.218339 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.227236 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" (UID: "d8afbf63-cd8b-483a-a533-ccddc2c3ebc0"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.266868 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfrwf\" (UniqueName: \"kubernetes.io/projected/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-kube-api-access-lfrwf\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.266984 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.267062 4753 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.267126 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.329352 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"346a135f-f1af-4968-9c9f-4540f2a71161","Type":"ContainerStarted","Data":"392235de3fdd507b8f3ae45d3b5026638b1852540f2b5ac0d4625ff932afb243"} Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.340796 4753 generic.go:334] "Generic (PLEG): container finished" podID="d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" containerID="0b32bba98d152b9756a5785b8f17a9dd6b7f69f421f0f42cdf885600ed3b23b7" exitCode=0 Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.341848 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.345009 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0","Type":"ContainerDied","Data":"0b32bba98d152b9756a5785b8f17a9dd6b7f69f421f0f42cdf885600ed3b23b7"} Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.345066 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8afbf63-cd8b-483a-a533-ccddc2c3ebc0","Type":"ContainerDied","Data":"61675852914e162a8bbcd198ca0726078d8c460e10beb8c21a0239fb5dbf7393"} Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.345091 4753 scope.go:117] "RemoveContainer" containerID="0b32bba98d152b9756a5785b8f17a9dd6b7f69f421f0f42cdf885600ed3b23b7" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.366525 4753 scope.go:117] "RemoveContainer" containerID="ec4418cbdb10776cb511fea8368349ae7e960bec17a95b7371c061ab2a4d4eee" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.387455 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.396567 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.404553 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:34:10 crc kubenswrapper[4753]: E1005 20:34:10.406112 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" containerName="nova-metadata-log" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.406131 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" containerName="nova-metadata-log" Oct 05 20:34:10 crc kubenswrapper[4753]: E1005 20:34:10.406164 4753 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" containerName="nova-metadata-metadata" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.406171 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" containerName="nova-metadata-metadata" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.406324 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" containerName="nova-metadata-log" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.406341 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" containerName="nova-metadata-metadata" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.407337 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.410745 4753 scope.go:117] "RemoveContainer" containerID="0b32bba98d152b9756a5785b8f17a9dd6b7f69f421f0f42cdf885600ed3b23b7" Oct 05 20:34:10 crc kubenswrapper[4753]: E1005 20:34:10.411325 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b32bba98d152b9756a5785b8f17a9dd6b7f69f421f0f42cdf885600ed3b23b7\": container with ID starting with 0b32bba98d152b9756a5785b8f17a9dd6b7f69f421f0f42cdf885600ed3b23b7 not found: ID does not exist" containerID="0b32bba98d152b9756a5785b8f17a9dd6b7f69f421f0f42cdf885600ed3b23b7" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.411464 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b32bba98d152b9756a5785b8f17a9dd6b7f69f421f0f42cdf885600ed3b23b7"} err="failed to get container status \"0b32bba98d152b9756a5785b8f17a9dd6b7f69f421f0f42cdf885600ed3b23b7\": rpc error: code = NotFound desc = could not find container \"0b32bba98d152b9756a5785b8f17a9dd6b7f69f421f0f42cdf885600ed3b23b7\": container with ID 
starting with 0b32bba98d152b9756a5785b8f17a9dd6b7f69f421f0f42cdf885600ed3b23b7 not found: ID does not exist" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.411559 4753 scope.go:117] "RemoveContainer" containerID="ec4418cbdb10776cb511fea8368349ae7e960bec17a95b7371c061ab2a4d4eee" Oct 05 20:34:10 crc kubenswrapper[4753]: E1005 20:34:10.411926 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec4418cbdb10776cb511fea8368349ae7e960bec17a95b7371c061ab2a4d4eee\": container with ID starting with ec4418cbdb10776cb511fea8368349ae7e960bec17a95b7371c061ab2a4d4eee not found: ID does not exist" containerID="ec4418cbdb10776cb511fea8368349ae7e960bec17a95b7371c061ab2a4d4eee" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.411982 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4418cbdb10776cb511fea8368349ae7e960bec17a95b7371c061ab2a4d4eee"} err="failed to get container status \"ec4418cbdb10776cb511fea8368349ae7e960bec17a95b7371c061ab2a4d4eee\": rpc error: code = NotFound desc = could not find container \"ec4418cbdb10776cb511fea8368349ae7e960bec17a95b7371c061ab2a4d4eee\": container with ID starting with ec4418cbdb10776cb511fea8368349ae7e960bec17a95b7371c061ab2a4d4eee not found: ID does not exist" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.413568 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.413887 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.420220 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.475666 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3649c567-6d73-4afe-a1aa-d6621a5cc89f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3649c567-6d73-4afe-a1aa-d6621a5cc89f\") " pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.475957 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3649c567-6d73-4afe-a1aa-d6621a5cc89f-logs\") pod \"nova-metadata-0\" (UID: \"3649c567-6d73-4afe-a1aa-d6621a5cc89f\") " pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.476191 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdr4m\" (UniqueName: \"kubernetes.io/projected/3649c567-6d73-4afe-a1aa-d6621a5cc89f-kube-api-access-kdr4m\") pod \"nova-metadata-0\" (UID: \"3649c567-6d73-4afe-a1aa-d6621a5cc89f\") " pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.476307 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3649c567-6d73-4afe-a1aa-d6621a5cc89f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3649c567-6d73-4afe-a1aa-d6621a5cc89f\") " pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.476340 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3649c567-6d73-4afe-a1aa-d6621a5cc89f-config-data\") pod \"nova-metadata-0\" (UID: \"3649c567-6d73-4afe-a1aa-d6621a5cc89f\") " pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: E1005 20:34:10.548835 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8afbf63_cd8b_483a_a533_ccddc2c3ebc0.slice/crio-61675852914e162a8bbcd198ca0726078d8c460e10beb8c21a0239fb5dbf7393\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8afbf63_cd8b_483a_a533_ccddc2c3ebc0.slice\": RecentStats: unable to find data in memory cache]" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.578358 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3649c567-6d73-4afe-a1aa-d6621a5cc89f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3649c567-6d73-4afe-a1aa-d6621a5cc89f\") " pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.578994 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3649c567-6d73-4afe-a1aa-d6621a5cc89f-logs\") pod \"nova-metadata-0\" (UID: \"3649c567-6d73-4afe-a1aa-d6621a5cc89f\") " pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.579524 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdr4m\" (UniqueName: \"kubernetes.io/projected/3649c567-6d73-4afe-a1aa-d6621a5cc89f-kube-api-access-kdr4m\") pod \"nova-metadata-0\" (UID: \"3649c567-6d73-4afe-a1aa-d6621a5cc89f\") " pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.579984 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3649c567-6d73-4afe-a1aa-d6621a5cc89f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3649c567-6d73-4afe-a1aa-d6621a5cc89f\") " pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.580452 4753 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3649c567-6d73-4afe-a1aa-d6621a5cc89f-config-data\") pod \"nova-metadata-0\" (UID: \"3649c567-6d73-4afe-a1aa-d6621a5cc89f\") " pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.579462 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3649c567-6d73-4afe-a1aa-d6621a5cc89f-logs\") pod \"nova-metadata-0\" (UID: \"3649c567-6d73-4afe-a1aa-d6621a5cc89f\") " pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.583933 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3649c567-6d73-4afe-a1aa-d6621a5cc89f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3649c567-6d73-4afe-a1aa-d6621a5cc89f\") " pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.584040 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3649c567-6d73-4afe-a1aa-d6621a5cc89f-config-data\") pod \"nova-metadata-0\" (UID: \"3649c567-6d73-4afe-a1aa-d6621a5cc89f\") " pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.584243 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3649c567-6d73-4afe-a1aa-d6621a5cc89f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3649c567-6d73-4afe-a1aa-d6621a5cc89f\") " pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.596705 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdr4m\" (UniqueName: \"kubernetes.io/projected/3649c567-6d73-4afe-a1aa-d6621a5cc89f-kube-api-access-kdr4m\") pod \"nova-metadata-0\" (UID: \"3649c567-6d73-4afe-a1aa-d6621a5cc89f\") " 
pod="openstack/nova-metadata-0" Oct 05 20:34:10 crc kubenswrapper[4753]: I1005 20:34:10.782515 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 05 20:34:11 crc kubenswrapper[4753]: I1005 20:34:11.213788 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 05 20:34:11 crc kubenswrapper[4753]: W1005 20:34:11.230107 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3649c567_6d73_4afe_a1aa_d6621a5cc89f.slice/crio-62f158876803d40f5dc83808533df84a11ebeec3878977ab15c5241fe4d2f6b7 WatchSource:0}: Error finding container 62f158876803d40f5dc83808533df84a11ebeec3878977ab15c5241fe4d2f6b7: Status 404 returned error can't find the container with id 62f158876803d40f5dc83808533df84a11ebeec3878977ab15c5241fe4d2f6b7 Oct 05 20:34:11 crc kubenswrapper[4753]: I1005 20:34:11.354886 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"346a135f-f1af-4968-9c9f-4540f2a71161","Type":"ContainerStarted","Data":"1cf2919a372972203e9e52b10d93b91a0e9f44a08ebf568751dbb0cbceb9133f"} Oct 05 20:34:11 crc kubenswrapper[4753]: I1005 20:34:11.361787 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3649c567-6d73-4afe-a1aa-d6621a5cc89f","Type":"ContainerStarted","Data":"62f158876803d40f5dc83808533df84a11ebeec3878977ab15c5241fe4d2f6b7"} Oct 05 20:34:11 crc kubenswrapper[4753]: I1005 20:34:11.376117 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.376096125 podStartE2EDuration="2.376096125s" podCreationTimestamp="2025-10-05 20:34:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:34:11.37596432 +0000 UTC m=+1160.224292562" 
watchObservedRunningTime="2025-10-05 20:34:11.376096125 +0000 UTC m=+1160.224424357" Oct 05 20:34:11 crc kubenswrapper[4753]: I1005 20:34:11.863755 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8afbf63-cd8b-483a-a533-ccddc2c3ebc0" path="/var/lib/kubelet/pods/d8afbf63-cd8b-483a-a533-ccddc2c3ebc0/volumes" Oct 05 20:34:12 crc kubenswrapper[4753]: I1005 20:34:12.387913 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3649c567-6d73-4afe-a1aa-d6621a5cc89f","Type":"ContainerStarted","Data":"340a47717b56c273b81c4a6e0350f3dabee00ca3b8553075b553f7fc9243675b"} Oct 05 20:34:12 crc kubenswrapper[4753]: I1005 20:34:12.387973 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3649c567-6d73-4afe-a1aa-d6621a5cc89f","Type":"ContainerStarted","Data":"3eed17a28f10258abd7c9347d32f0319ad88c2d1f1c6f68cb3e827c9f6ea1948"} Oct 05 20:34:12 crc kubenswrapper[4753]: I1005 20:34:12.418315 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.418294066 podStartE2EDuration="2.418294066s" podCreationTimestamp="2025-10-05 20:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:34:12.411112513 +0000 UTC m=+1161.259440745" watchObservedRunningTime="2025-10-05 20:34:12.418294066 +0000 UTC m=+1161.266622298" Oct 05 20:34:14 crc kubenswrapper[4753]: I1005 20:34:14.708942 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 05 20:34:15 crc kubenswrapper[4753]: I1005 20:34:15.783516 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 05 20:34:15 crc kubenswrapper[4753]: I1005 20:34:15.783570 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 05 
20:34:17 crc kubenswrapper[4753]: I1005 20:34:17.699375 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 05 20:34:17 crc kubenswrapper[4753]: I1005 20:34:17.699742 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 05 20:34:18 crc kubenswrapper[4753]: I1005 20:34:18.713358 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fd2971d2-d61f-4268-9366-6e11ae7f71bc" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 05 20:34:18 crc kubenswrapper[4753]: I1005 20:34:18.713348 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fd2971d2-d61f-4268-9366-6e11ae7f71bc" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.188:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 05 20:34:19 crc kubenswrapper[4753]: I1005 20:34:19.708837 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 05 20:34:19 crc kubenswrapper[4753]: I1005 20:34:19.743380 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 05 20:34:20 crc kubenswrapper[4753]: I1005 20:34:20.501480 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 05 20:34:20 crc kubenswrapper[4753]: I1005 20:34:20.783489 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 05 20:34:20 crc kubenswrapper[4753]: I1005 20:34:20.783758 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 05 20:34:21 crc kubenswrapper[4753]: I1005 20:34:21.796512 4753 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3649c567-6d73-4afe-a1aa-d6621a5cc89f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 05 20:34:21 crc kubenswrapper[4753]: I1005 20:34:21.796527 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3649c567-6d73-4afe-a1aa-d6621a5cc89f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 05 20:34:26 crc kubenswrapper[4753]: I1005 20:34:26.511665 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 05 20:34:27 crc kubenswrapper[4753]: I1005 20:34:27.709462 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 05 20:34:27 crc kubenswrapper[4753]: I1005 20:34:27.710115 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 05 20:34:27 crc kubenswrapper[4753]: I1005 20:34:27.724731 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 05 20:34:27 crc kubenswrapper[4753]: I1005 20:34:27.730196 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 05 20:34:28 crc kubenswrapper[4753]: I1005 20:34:28.547184 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 05 20:34:28 crc kubenswrapper[4753]: I1005 20:34:28.554746 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 05 20:34:30 crc kubenswrapper[4753]: I1005 20:34:30.788900 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 05 20:34:30 crc 
kubenswrapper[4753]: I1005 20:34:30.794008 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 05 20:34:30 crc kubenswrapper[4753]: I1005 20:34:30.794696 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 05 20:34:31 crc kubenswrapper[4753]: I1005 20:34:31.590264 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 05 20:34:34 crc kubenswrapper[4753]: I1005 20:34:34.490052 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:34:34 crc kubenswrapper[4753]: I1005 20:34:34.490555 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:34:40 crc kubenswrapper[4753]: I1005 20:34:40.561474 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 05 20:34:41 crc kubenswrapper[4753]: I1005 20:34:41.580559 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 05 20:34:44 crc kubenswrapper[4753]: I1005 20:34:44.718016 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="73d182e9-8e4b-46ce-aa0b-6fd751eefecd" containerName="rabbitmq" containerID="cri-o://e838dbccfedc74c736ecacf9ded9f3acd151492c5726744b237bf41a76e69e42" gracePeriod=604796 Oct 05 20:34:45 crc kubenswrapper[4753]: I1005 20:34:45.480747 4753 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="6f77d64e-7c7a-4770-9710-8c4aa767bcfa" containerName="rabbitmq" containerID="cri-o://527b88c7346a96619d1e411b8d16303db9c661761cf232b2dba5107da80d25a0" gracePeriod=604797 Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.306747 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.445903 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-pod-info\") pod \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.448464 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-config-data\") pod \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.448494 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-plugins\") pod \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.448541 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-confd\") pod \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.448561 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-server-conf\") pod \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.448584 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-erlang-cookie\") pod \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.448602 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-tls\") pod \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.448667 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-plugins-conf\") pod \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.448744 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d7wn\" (UniqueName: \"kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-kube-api-access-2d7wn\") pod \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.448762 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " Oct 05 20:34:51 crc 
kubenswrapper[4753]: I1005 20:34:51.448784 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-erlang-cookie-secret\") pod \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\" (UID: \"73d182e9-8e4b-46ce-aa0b-6fd751eefecd\") " Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.450652 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "73d182e9-8e4b-46ce-aa0b-6fd751eefecd" (UID: "73d182e9-8e4b-46ce-aa0b-6fd751eefecd"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.450862 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "73d182e9-8e4b-46ce-aa0b-6fd751eefecd" (UID: "73d182e9-8e4b-46ce-aa0b-6fd751eefecd"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.450922 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "73d182e9-8e4b-46ce-aa0b-6fd751eefecd" (UID: "73d182e9-8e4b-46ce-aa0b-6fd751eefecd"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.454858 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-pod-info" (OuterVolumeSpecName: "pod-info") pod "73d182e9-8e4b-46ce-aa0b-6fd751eefecd" (UID: "73d182e9-8e4b-46ce-aa0b-6fd751eefecd"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.456708 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "73d182e9-8e4b-46ce-aa0b-6fd751eefecd" (UID: "73d182e9-8e4b-46ce-aa0b-6fd751eefecd"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.459312 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "73d182e9-8e4b-46ce-aa0b-6fd751eefecd" (UID: "73d182e9-8e4b-46ce-aa0b-6fd751eefecd"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.482502 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "73d182e9-8e4b-46ce-aa0b-6fd751eefecd" (UID: "73d182e9-8e4b-46ce-aa0b-6fd751eefecd"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.494813 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-kube-api-access-2d7wn" (OuterVolumeSpecName: "kube-api-access-2d7wn") pod "73d182e9-8e4b-46ce-aa0b-6fd751eefecd" (UID: "73d182e9-8e4b-46ce-aa0b-6fd751eefecd"). InnerVolumeSpecName "kube-api-access-2d7wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.512103 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-config-data" (OuterVolumeSpecName: "config-data") pod "73d182e9-8e4b-46ce-aa0b-6fd751eefecd" (UID: "73d182e9-8e4b-46ce-aa0b-6fd751eefecd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.551388 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d7wn\" (UniqueName: \"kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-kube-api-access-2d7wn\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.551613 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.551737 4753 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.551823 4753 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-pod-info\") on 
node \"crc\" DevicePath \"\"" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.551883 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.551945 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.552004 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.552067 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.552128 4753 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.555287 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-server-conf" (OuterVolumeSpecName: "server-conf") pod "73d182e9-8e4b-46ce-aa0b-6fd751eefecd" (UID: "73d182e9-8e4b-46ce-aa0b-6fd751eefecd"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.609693 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.629465 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "73d182e9-8e4b-46ce-aa0b-6fd751eefecd" (UID: "73d182e9-8e4b-46ce-aa0b-6fd751eefecd"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.631543 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="6f77d64e-7c7a-4770-9710-8c4aa767bcfa" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.653644 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.653673 4753 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/73d182e9-8e4b-46ce-aa0b-6fd751eefecd-server-conf\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.653682 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:51 crc kubenswrapper[4753]: E1005 20:34:51.665864 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f77d64e_7c7a_4770_9710_8c4aa767bcfa.slice/crio-527b88c7346a96619d1e411b8d16303db9c661761cf232b2dba5107da80d25a0.scope\": RecentStats: unable to find data in memory cache]" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.805941 4753 generic.go:334] "Generic (PLEG): container finished" podID="73d182e9-8e4b-46ce-aa0b-6fd751eefecd" containerID="e838dbccfedc74c736ecacf9ded9f3acd151492c5726744b237bf41a76e69e42" exitCode=0 Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.806015 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73d182e9-8e4b-46ce-aa0b-6fd751eefecd","Type":"ContainerDied","Data":"e838dbccfedc74c736ecacf9ded9f3acd151492c5726744b237bf41a76e69e42"} Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.806310 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"73d182e9-8e4b-46ce-aa0b-6fd751eefecd","Type":"ContainerDied","Data":"f10e1725cc723c29ef4bd7b39b13680e39522e023e3435dbc375a368b642b94c"} Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.806334 4753 scope.go:117] "RemoveContainer" containerID="e838dbccfedc74c736ecacf9ded9f3acd151492c5726744b237bf41a76e69e42" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.806053 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.810520 4753 generic.go:334] "Generic (PLEG): container finished" podID="6f77d64e-7c7a-4770-9710-8c4aa767bcfa" containerID="527b88c7346a96619d1e411b8d16303db9c661761cf232b2dba5107da80d25a0" exitCode=0 Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.810615 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f77d64e-7c7a-4770-9710-8c4aa767bcfa","Type":"ContainerDied","Data":"527b88c7346a96619d1e411b8d16303db9c661761cf232b2dba5107da80d25a0"} Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.888125 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.902730 4753 scope.go:117] "RemoveContainer" containerID="a730afd36443ffc2b80abbc60e2cca6017825e03868c24ebe4bb442fedc5ae24" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.911499 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 05 20:34:51 crc kubenswrapper[4753]: E1005 20:34:51.940807 4753 info.go:109] Failed to get network devices: open /sys/class/net/b7b9aaa4bf46c8f/address: no such file or directory Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.954627 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 05 20:34:51 crc kubenswrapper[4753]: E1005 20:34:51.955016 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d182e9-8e4b-46ce-aa0b-6fd751eefecd" containerName="setup-container" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.955029 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d182e9-8e4b-46ce-aa0b-6fd751eefecd" containerName="setup-container" Oct 05 20:34:51 crc kubenswrapper[4753]: E1005 20:34:51.955041 4753 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="73d182e9-8e4b-46ce-aa0b-6fd751eefecd" containerName="rabbitmq" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.955049 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d182e9-8e4b-46ce-aa0b-6fd751eefecd" containerName="rabbitmq" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.955255 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d182e9-8e4b-46ce-aa0b-6fd751eefecd" containerName="rabbitmq" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.956203 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.964156 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.964665 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/468c6dc5-e196-4084-9211-d2b06253832d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.964695 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/468c6dc5-e196-4084-9211-d2b06253832d-config-data\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.964712 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/468c6dc5-e196-4084-9211-d2b06253832d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 
20:34:51.964737 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/468c6dc5-e196-4084-9211-d2b06253832d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.964768 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/468c6dc5-e196-4084-9211-d2b06253832d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.964792 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.964815 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/468c6dc5-e196-4084-9211-d2b06253832d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.964832 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dqzg\" (UniqueName: \"kubernetes.io/projected/468c6dc5-e196-4084-9211-d2b06253832d-kube-api-access-7dqzg\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.964861 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/468c6dc5-e196-4084-9211-d2b06253832d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.964879 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/468c6dc5-e196-4084-9211-d2b06253832d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.964896 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/468c6dc5-e196-4084-9211-d2b06253832d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.965994 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-4zvh4" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.969388 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.969649 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.969766 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.969914 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.970112 4753 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 05 20:34:51 crc kubenswrapper[4753]: I1005 20:34:51.997792 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.006237 4753 scope.go:117] "RemoveContainer" containerID="e838dbccfedc74c736ecacf9ded9f3acd151492c5726744b237bf41a76e69e42" Oct 05 20:34:52 crc kubenswrapper[4753]: E1005 20:34:52.007729 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e838dbccfedc74c736ecacf9ded9f3acd151492c5726744b237bf41a76e69e42\": container with ID starting with e838dbccfedc74c736ecacf9ded9f3acd151492c5726744b237bf41a76e69e42 not found: ID does not exist" containerID="e838dbccfedc74c736ecacf9ded9f3acd151492c5726744b237bf41a76e69e42" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.007765 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e838dbccfedc74c736ecacf9ded9f3acd151492c5726744b237bf41a76e69e42"} err="failed to get container status \"e838dbccfedc74c736ecacf9ded9f3acd151492c5726744b237bf41a76e69e42\": rpc error: code = NotFound desc = could not find container \"e838dbccfedc74c736ecacf9ded9f3acd151492c5726744b237bf41a76e69e42\": container with ID starting with e838dbccfedc74c736ecacf9ded9f3acd151492c5726744b237bf41a76e69e42 not found: ID does not exist" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.007814 4753 scope.go:117] "RemoveContainer" containerID="a730afd36443ffc2b80abbc60e2cca6017825e03868c24ebe4bb442fedc5ae24" Oct 05 20:34:52 crc kubenswrapper[4753]: E1005 20:34:52.009536 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a730afd36443ffc2b80abbc60e2cca6017825e03868c24ebe4bb442fedc5ae24\": container with ID starting with a730afd36443ffc2b80abbc60e2cca6017825e03868c24ebe4bb442fedc5ae24 not 
found: ID does not exist" containerID="a730afd36443ffc2b80abbc60e2cca6017825e03868c24ebe4bb442fedc5ae24" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.009564 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a730afd36443ffc2b80abbc60e2cca6017825e03868c24ebe4bb442fedc5ae24"} err="failed to get container status \"a730afd36443ffc2b80abbc60e2cca6017825e03868c24ebe4bb442fedc5ae24\": rpc error: code = NotFound desc = could not find container \"a730afd36443ffc2b80abbc60e2cca6017825e03868c24ebe4bb442fedc5ae24\": container with ID starting with a730afd36443ffc2b80abbc60e2cca6017825e03868c24ebe4bb442fedc5ae24 not found: ID does not exist" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.066530 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/468c6dc5-e196-4084-9211-d2b06253832d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.066598 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/468c6dc5-e196-4084-9211-d2b06253832d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.066633 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/468c6dc5-e196-4084-9211-d2b06253832d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.066850 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/468c6dc5-e196-4084-9211-d2b06253832d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.066887 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/468c6dc5-e196-4084-9211-d2b06253832d-config-data\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.066914 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/468c6dc5-e196-4084-9211-d2b06253832d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.066954 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/468c6dc5-e196-4084-9211-d2b06253832d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.067015 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/468c6dc5-e196-4084-9211-d2b06253832d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.067068 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " 
pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.067126 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/468c6dc5-e196-4084-9211-d2b06253832d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.067180 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dqzg\" (UniqueName: \"kubernetes.io/projected/468c6dc5-e196-4084-9211-d2b06253832d-kube-api-access-7dqzg\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.071780 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.072122 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.073761 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/468c6dc5-e196-4084-9211-d2b06253832d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.086414 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/468c6dc5-e196-4084-9211-d2b06253832d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.086983 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/468c6dc5-e196-4084-9211-d2b06253832d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.087999 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/468c6dc5-e196-4084-9211-d2b06253832d-config-data\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.089564 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/468c6dc5-e196-4084-9211-d2b06253832d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.093305 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/468c6dc5-e196-4084-9211-d2b06253832d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.093316 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/468c6dc5-e196-4084-9211-d2b06253832d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.101020 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/468c6dc5-e196-4084-9211-d2b06253832d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.101570 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/468c6dc5-e196-4084-9211-d2b06253832d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.119223 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dqzg\" (UniqueName: \"kubernetes.io/projected/468c6dc5-e196-4084-9211-d2b06253832d-kube-api-access-7dqzg\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.131700 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"468c6dc5-e196-4084-9211-d2b06253832d\") " 
pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.167842 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-erlang-cookie\") pod \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.167930 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-erlang-cookie-secret\") pod \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.167974 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-config-data\") pod \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.168002 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.168041 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcjqf\" (UniqueName: \"kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-kube-api-access-kcjqf\") pod \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.168102 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-plugins\") pod \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.168242 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-pod-info\") pod \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.168276 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-confd\") pod \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.168371 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-tls\") pod \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.168400 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-plugins-conf\") pod \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.168434 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-server-conf\") pod \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\" (UID: \"6f77d64e-7c7a-4770-9710-8c4aa767bcfa\") " Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.168697 4753 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6f77d64e-7c7a-4770-9710-8c4aa767bcfa" (UID: "6f77d64e-7c7a-4770-9710-8c4aa767bcfa"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.169090 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6f77d64e-7c7a-4770-9710-8c4aa767bcfa" (UID: "6f77d64e-7c7a-4770-9710-8c4aa767bcfa"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.169094 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.179550 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-kube-api-access-kcjqf" (OuterVolumeSpecName: "kube-api-access-kcjqf") pod "6f77d64e-7c7a-4770-9710-8c4aa767bcfa" (UID: "6f77d64e-7c7a-4770-9710-8c4aa767bcfa"). InnerVolumeSpecName "kube-api-access-kcjqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.179815 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6f77d64e-7c7a-4770-9710-8c4aa767bcfa" (UID: "6f77d64e-7c7a-4770-9710-8c4aa767bcfa"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.179942 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "6f77d64e-7c7a-4770-9710-8c4aa767bcfa" (UID: "6f77d64e-7c7a-4770-9710-8c4aa767bcfa"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.180443 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6f77d64e-7c7a-4770-9710-8c4aa767bcfa" (UID: "6f77d64e-7c7a-4770-9710-8c4aa767bcfa"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.184328 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-pod-info" (OuterVolumeSpecName: "pod-info") pod "6f77d64e-7c7a-4770-9710-8c4aa767bcfa" (UID: "6f77d64e-7c7a-4770-9710-8c4aa767bcfa"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.184385 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6f77d64e-7c7a-4770-9710-8c4aa767bcfa" (UID: "6f77d64e-7c7a-4770-9710-8c4aa767bcfa"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.217580 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-config-data" (OuterVolumeSpecName: "config-data") pod "6f77d64e-7c7a-4770-9710-8c4aa767bcfa" (UID: "6f77d64e-7c7a-4770-9710-8c4aa767bcfa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.270280 4753 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.270312 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.270324 4753 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.270334 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.270356 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.270365 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcjqf\" (UniqueName: 
\"kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-kube-api-access-kcjqf\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.270373 4753 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-pod-info\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.270381 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.300730 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-server-conf" (OuterVolumeSpecName: "server-conf") pod "6f77d64e-7c7a-4770-9710-8c4aa767bcfa" (UID: "6f77d64e-7c7a-4770-9710-8c4aa767bcfa"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.315456 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.367535 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.375164 4753 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-server-conf\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.375195 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.384541 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6f77d64e-7c7a-4770-9710-8c4aa767bcfa" (UID: "6f77d64e-7c7a-4770-9710-8c4aa767bcfa"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.476921 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f77d64e-7c7a-4770-9710-8c4aa767bcfa-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.820589 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.820581 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f77d64e-7c7a-4770-9710-8c4aa767bcfa","Type":"ContainerDied","Data":"b7b9aaa4bf46c8f8ebd0e0b759f81cb9cf2fe3de6a597a5ca9e34c99552ceb43"} Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.821236 4753 scope.go:117] "RemoveContainer" containerID="527b88c7346a96619d1e411b8d16303db9c661761cf232b2dba5107da80d25a0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.863756 4753 scope.go:117] "RemoveContainer" containerID="5147e49072301e23beb6f2459787fe2e686175dda9c52f176ca87e213655d772" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.867187 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.876915 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.897030 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 05 20:34:52 crc kubenswrapper[4753]: E1005 20:34:52.897586 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f77d64e-7c7a-4770-9710-8c4aa767bcfa" containerName="rabbitmq" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.897653 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f77d64e-7c7a-4770-9710-8c4aa767bcfa" containerName="rabbitmq" Oct 05 20:34:52 crc kubenswrapper[4753]: E1005 20:34:52.897726 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f77d64e-7c7a-4770-9710-8c4aa767bcfa" containerName="setup-container" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.897790 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f77d64e-7c7a-4770-9710-8c4aa767bcfa" containerName="setup-container" Oct 05 20:34:52 
crc kubenswrapper[4753]: I1005 20:34:52.898006 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f77d64e-7c7a-4770-9710-8c4aa767bcfa" containerName="rabbitmq" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.899033 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.904558 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.904807 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.905106 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.905315 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7sckp" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.905464 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.905628 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.905722 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.912182 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.989289 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.989338 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.989960 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.989996 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.990020 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll9hh\" (UniqueName: \"kubernetes.io/projected/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-kube-api-access-ll9hh\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.990042 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.990076 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.990114 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.990174 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.990190 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:52 crc kubenswrapper[4753]: I1005 20:34:52.990246 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.025338 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.091295 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.091338 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.091370 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.091394 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.091415 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ll9hh\" (UniqueName: \"kubernetes.io/projected/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-kube-api-access-ll9hh\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.091433 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.091469 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.091501 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.091535 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.091551 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.091575 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.092317 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.092843 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.093076 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.093420 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.094910 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.096366 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.099795 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.100219 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.105595 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc 
kubenswrapper[4753]: I1005 20:34:53.118600 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.122608 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll9hh\" (UniqueName: \"kubernetes.io/projected/0022b5ba-c84b-4ee1-84a5-8e04d7c4d330-kube-api-access-ll9hh\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.150941 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330\") " pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.221114 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.725459 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 05 20:34:53 crc kubenswrapper[4753]: W1005 20:34:53.725910 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0022b5ba_c84b_4ee1_84a5_8e04d7c4d330.slice/crio-32d9d968136c79ee7ee37e3ccf219b7936b7815f642451d77b75eee630ba2afc WatchSource:0}: Error finding container 32d9d968136c79ee7ee37e3ccf219b7936b7815f642451d77b75eee630ba2afc: Status 404 returned error can't find the container with id 32d9d968136c79ee7ee37e3ccf219b7936b7815f642451d77b75eee630ba2afc Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.848532 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330","Type":"ContainerStarted","Data":"32d9d968136c79ee7ee37e3ccf219b7936b7815f642451d77b75eee630ba2afc"} Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.849615 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"468c6dc5-e196-4084-9211-d2b06253832d","Type":"ContainerStarted","Data":"ea247c20f1f15de009a0ba052bc4e3ddf883bc7fc9cf17cb8cbf0371492f1faf"} Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.867069 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f77d64e-7c7a-4770-9710-8c4aa767bcfa" path="/var/lib/kubelet/pods/6f77d64e-7c7a-4770-9710-8c4aa767bcfa/volumes" Oct 05 20:34:53 crc kubenswrapper[4753]: I1005 20:34:53.868766 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73d182e9-8e4b-46ce-aa0b-6fd751eefecd" path="/var/lib/kubelet/pods/73d182e9-8e4b-46ce-aa0b-6fd751eefecd/volumes" Oct 05 20:34:54 crc kubenswrapper[4753]: I1005 20:34:54.862654 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"468c6dc5-e196-4084-9211-d2b06253832d","Type":"ContainerStarted","Data":"60c353846de38c0b4d78289265291a4929092f89cd0c75e61cb0430aff4cd01f"} Oct 05 20:34:55 crc kubenswrapper[4753]: I1005 20:34:55.878163 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330","Type":"ContainerStarted","Data":"3c7649af84de79b942173a24bf374238f8a2607e4e23c27574a4294979517ba9"} Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.297542 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c56997c9f-cs94f"] Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.299300 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.326285 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.366453 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-ovsdbserver-sb\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.366513 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-config\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.366592 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.366651 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-dns-svc\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.366693 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-ovsdbserver-nb\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.367001 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncr56\" (UniqueName: \"kubernetes.io/projected/6771969b-9209-46ec-b2a3-f425675c4090-kube-api-access-ncr56\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.373101 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c56997c9f-cs94f"] Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.468395 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " 
pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.469623 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-dns-svc\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.470799 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-ovsdbserver-nb\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.471697 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncr56\" (UniqueName: \"kubernetes.io/projected/6771969b-9209-46ec-b2a3-f425675c4090-kube-api-access-ncr56\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.471910 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-ovsdbserver-sb\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.472059 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-config\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc 
kubenswrapper[4753]: I1005 20:34:57.472829 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-config\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.469420 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.470681 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-dns-svc\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.471600 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-ovsdbserver-nb\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.478046 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-ovsdbserver-sb\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.493749 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-ncr56\" (UniqueName: \"kubernetes.io/projected/6771969b-9209-46ec-b2a3-f425675c4090-kube-api-access-ncr56\") pod \"dnsmasq-dns-7c56997c9f-cs94f\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:57 crc kubenswrapper[4753]: I1005 20:34:57.631385 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:34:58 crc kubenswrapper[4753]: I1005 20:34:58.089874 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c56997c9f-cs94f"] Oct 05 20:34:58 crc kubenswrapper[4753]: I1005 20:34:58.904018 4753 generic.go:334] "Generic (PLEG): container finished" podID="6771969b-9209-46ec-b2a3-f425675c4090" containerID="5b4af5940a17d44043f44a8e89778b6148e39a54b2a22299b99da3d0603592a7" exitCode=0 Oct 05 20:34:58 crc kubenswrapper[4753]: I1005 20:34:58.904117 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" event={"ID":"6771969b-9209-46ec-b2a3-f425675c4090","Type":"ContainerDied","Data":"5b4af5940a17d44043f44a8e89778b6148e39a54b2a22299b99da3d0603592a7"} Oct 05 20:34:58 crc kubenswrapper[4753]: I1005 20:34:58.904271 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" event={"ID":"6771969b-9209-46ec-b2a3-f425675c4090","Type":"ContainerStarted","Data":"d285edc34abc0ad7d4c3d4d5f70824f085ceebc424b923d9b5e12695acf012fe"} Oct 05 20:34:59 crc kubenswrapper[4753]: I1005 20:34:59.917297 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" event={"ID":"6771969b-9209-46ec-b2a3-f425675c4090","Type":"ContainerStarted","Data":"e6fd4068300439a00111c32773c5a0dd34d90b3965f22da8884038dbca88664f"} Oct 05 20:34:59 crc kubenswrapper[4753]: I1005 20:34:59.918406 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 
20:34:59 crc kubenswrapper[4753]: I1005 20:34:59.944319 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" podStartSLOduration=2.944293035 podStartE2EDuration="2.944293035s" podCreationTimestamp="2025-10-05 20:34:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:34:59.941421845 +0000 UTC m=+1208.789750117" watchObservedRunningTime="2025-10-05 20:34:59.944293035 +0000 UTC m=+1208.792621287" Oct 05 20:35:01 crc kubenswrapper[4753]: E1005 20:35:01.919026 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d182e9_8e4b_46ce_aa0b_6fd751eefecd.slice/crio-f10e1725cc723c29ef4bd7b39b13680e39522e023e3435dbc375a368b642b94c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d182e9_8e4b_46ce_aa0b_6fd751eefecd.slice\": RecentStats: unable to find data in memory cache]" Oct 05 20:35:04 crc kubenswrapper[4753]: I1005 20:35:04.490116 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:35:04 crc kubenswrapper[4753]: I1005 20:35:04.490838 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:35:04 crc kubenswrapper[4753]: I1005 20:35:04.490900 4753 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:35:04 crc kubenswrapper[4753]: I1005 20:35:04.491988 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6035d1f392035c4d85f80611de9c6434a6132b34d7431e03335a2036e9d1df2f"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 20:35:04 crc kubenswrapper[4753]: I1005 20:35:04.492076 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" containerID="cri-o://6035d1f392035c4d85f80611de9c6434a6132b34d7431e03335a2036e9d1df2f" gracePeriod=600 Oct 05 20:35:04 crc kubenswrapper[4753]: I1005 20:35:04.967873 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="6035d1f392035c4d85f80611de9c6434a6132b34d7431e03335a2036e9d1df2f" exitCode=0 Oct 05 20:35:04 crc kubenswrapper[4753]: I1005 20:35:04.967931 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"6035d1f392035c4d85f80611de9c6434a6132b34d7431e03335a2036e9d1df2f"} Oct 05 20:35:04 crc kubenswrapper[4753]: I1005 20:35:04.967972 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"9030d6a687aa30375abc82b21b6ddaee627312bc9d654f1ab51991af41da5fd6"} Oct 05 20:35:04 crc kubenswrapper[4753]: I1005 20:35:04.968006 4753 scope.go:117] "RemoveContainer" 
containerID="4bd1799dc562bea716f28381e5b80e668ae3afe82aa4afc891b52d8b6b3b6337" Oct 05 20:35:07 crc kubenswrapper[4753]: I1005 20:35:07.632260 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:35:07 crc kubenswrapper[4753]: I1005 20:35:07.698566 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7759979f65-h9vzh"] Oct 05 20:35:07 crc kubenswrapper[4753]: I1005 20:35:07.699093 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7759979f65-h9vzh" podUID="4b6d8549-ef82-43f6-bc04-97565906e391" containerName="dnsmasq-dns" containerID="cri-o://241813e054282349133be92070c123e288e40424449341e84e28a6e89126f78f" gracePeriod=10 Oct 05 20:35:07 crc kubenswrapper[4753]: I1005 20:35:07.941789 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bccd47bfc-j5m69"] Oct 05 20:35:07 crc kubenswrapper[4753]: I1005 20:35:07.945027 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:07 crc kubenswrapper[4753]: I1005 20:35:07.961946 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bccd47bfc-j5m69"] Oct 05 20:35:07 crc kubenswrapper[4753]: I1005 20:35:07.970192 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nszgm\" (UniqueName: \"kubernetes.io/projected/670bf0ed-f184-4241-b9e7-989781ea4112-kube-api-access-nszgm\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:07 crc kubenswrapper[4753]: I1005 20:35:07.970316 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-config\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:07 crc kubenswrapper[4753]: I1005 20:35:07.970357 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-openstack-edpm-ipam\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:07 crc kubenswrapper[4753]: I1005 20:35:07.970386 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-ovsdbserver-nb\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:07 crc kubenswrapper[4753]: I1005 20:35:07.970424 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-dns-svc\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:07 crc kubenswrapper[4753]: I1005 20:35:07.970459 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-ovsdbserver-sb\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.000428 4753 generic.go:334] "Generic (PLEG): container finished" podID="4b6d8549-ef82-43f6-bc04-97565906e391" containerID="241813e054282349133be92070c123e288e40424449341e84e28a6e89126f78f" exitCode=0 Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.000477 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7759979f65-h9vzh" event={"ID":"4b6d8549-ef82-43f6-bc04-97565906e391","Type":"ContainerDied","Data":"241813e054282349133be92070c123e288e40424449341e84e28a6e89126f78f"} Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.071800 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-config\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.071862 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-openstack-edpm-ipam\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: 
\"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.071899 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-ovsdbserver-nb\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.071949 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-dns-svc\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.071990 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-ovsdbserver-sb\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.072038 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nszgm\" (UniqueName: \"kubernetes.io/projected/670bf0ed-f184-4241-b9e7-989781ea4112-kube-api-access-nszgm\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.076616 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-config\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " 
pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.077756 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-ovsdbserver-nb\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.078242 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-openstack-edpm-ipam\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.078430 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-ovsdbserver-sb\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.078612 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-dns-svc\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.093991 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nszgm\" (UniqueName: \"kubernetes.io/projected/670bf0ed-f184-4241-b9e7-989781ea4112-kube-api-access-nszgm\") pod \"dnsmasq-dns-5bccd47bfc-j5m69\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:08 crc 
kubenswrapper[4753]: I1005 20:35:08.147796 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.174418 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-config\") pod \"4b6d8549-ef82-43f6-bc04-97565906e391\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.174483 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-ovsdbserver-sb\") pod \"4b6d8549-ef82-43f6-bc04-97565906e391\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.174578 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-ovsdbserver-nb\") pod \"4b6d8549-ef82-43f6-bc04-97565906e391\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.174728 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km76q\" (UniqueName: \"kubernetes.io/projected/4b6d8549-ef82-43f6-bc04-97565906e391-kube-api-access-km76q\") pod \"4b6d8549-ef82-43f6-bc04-97565906e391\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.174754 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-dns-svc\") pod \"4b6d8549-ef82-43f6-bc04-97565906e391\" (UID: \"4b6d8549-ef82-43f6-bc04-97565906e391\") " Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.180775 4753 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b6d8549-ef82-43f6-bc04-97565906e391-kube-api-access-km76q" (OuterVolumeSpecName: "kube-api-access-km76q") pod "4b6d8549-ef82-43f6-bc04-97565906e391" (UID: "4b6d8549-ef82-43f6-bc04-97565906e391"). InnerVolumeSpecName "kube-api-access-km76q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.224750 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-config" (OuterVolumeSpecName: "config") pod "4b6d8549-ef82-43f6-bc04-97565906e391" (UID: "4b6d8549-ef82-43f6-bc04-97565906e391"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.232216 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4b6d8549-ef82-43f6-bc04-97565906e391" (UID: "4b6d8549-ef82-43f6-bc04-97565906e391"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.237112 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4b6d8549-ef82-43f6-bc04-97565906e391" (UID: "4b6d8549-ef82-43f6-bc04-97565906e391"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.254043 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4b6d8549-ef82-43f6-bc04-97565906e391" (UID: "4b6d8549-ef82-43f6-bc04-97565906e391"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.275164 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.276769 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km76q\" (UniqueName: \"kubernetes.io/projected/4b6d8549-ef82-43f6-bc04-97565906e391-kube-api-access-km76q\") on node \"crc\" DevicePath \"\"" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.276862 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.276919 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.276978 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 05 20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.277039 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b6d8549-ef82-43f6-bc04-97565906e391-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 05 
20:35:08 crc kubenswrapper[4753]: I1005 20:35:08.775395 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bccd47bfc-j5m69"] Oct 05 20:35:08 crc kubenswrapper[4753]: W1005 20:35:08.778298 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod670bf0ed_f184_4241_b9e7_989781ea4112.slice/crio-0102cbc1518c5bddb1edb280ba7117efdb96f05b02d74317ac972fa37d918d94 WatchSource:0}: Error finding container 0102cbc1518c5bddb1edb280ba7117efdb96f05b02d74317ac972fa37d918d94: Status 404 returned error can't find the container with id 0102cbc1518c5bddb1edb280ba7117efdb96f05b02d74317ac972fa37d918d94 Oct 05 20:35:09 crc kubenswrapper[4753]: I1005 20:35:09.010461 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" event={"ID":"670bf0ed-f184-4241-b9e7-989781ea4112","Type":"ContainerStarted","Data":"286af67d9e12ad2e5ed379526fbc273e45ffbb62d17a76dbcb0aa6602454d19b"} Oct 05 20:35:09 crc kubenswrapper[4753]: I1005 20:35:09.010777 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" event={"ID":"670bf0ed-f184-4241-b9e7-989781ea4112","Type":"ContainerStarted","Data":"0102cbc1518c5bddb1edb280ba7117efdb96f05b02d74317ac972fa37d918d94"} Oct 05 20:35:09 crc kubenswrapper[4753]: I1005 20:35:09.014028 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7759979f65-h9vzh" event={"ID":"4b6d8549-ef82-43f6-bc04-97565906e391","Type":"ContainerDied","Data":"8380009d62996cd7a1a4e6f7afb4332688c117ef4e4000d82f091363fcd73c7a"} Oct 05 20:35:09 crc kubenswrapper[4753]: I1005 20:35:09.014071 4753 scope.go:117] "RemoveContainer" containerID="241813e054282349133be92070c123e288e40424449341e84e28a6e89126f78f" Oct 05 20:35:09 crc kubenswrapper[4753]: I1005 20:35:09.014271 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7759979f65-h9vzh" Oct 05 20:35:09 crc kubenswrapper[4753]: I1005 20:35:09.062591 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7759979f65-h9vzh"] Oct 05 20:35:09 crc kubenswrapper[4753]: I1005 20:35:09.074830 4753 scope.go:117] "RemoveContainer" containerID="55ec1bf876d6b75234d1facc4d30770dd47bd245958787b36f755fda13e3935d" Oct 05 20:35:09 crc kubenswrapper[4753]: I1005 20:35:09.080106 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7759979f65-h9vzh"] Oct 05 20:35:09 crc kubenswrapper[4753]: I1005 20:35:09.862664 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b6d8549-ef82-43f6-bc04-97565906e391" path="/var/lib/kubelet/pods/4b6d8549-ef82-43f6-bc04-97565906e391/volumes" Oct 05 20:35:10 crc kubenswrapper[4753]: I1005 20:35:10.024072 4753 generic.go:334] "Generic (PLEG): container finished" podID="670bf0ed-f184-4241-b9e7-989781ea4112" containerID="286af67d9e12ad2e5ed379526fbc273e45ffbb62d17a76dbcb0aa6602454d19b" exitCode=0 Oct 05 20:35:10 crc kubenswrapper[4753]: I1005 20:35:10.024122 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" event={"ID":"670bf0ed-f184-4241-b9e7-989781ea4112","Type":"ContainerDied","Data":"286af67d9e12ad2e5ed379526fbc273e45ffbb62d17a76dbcb0aa6602454d19b"} Oct 05 20:35:11 crc kubenswrapper[4753]: I1005 20:35:11.037802 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" event={"ID":"670bf0ed-f184-4241-b9e7-989781ea4112","Type":"ContainerStarted","Data":"f7d41b4d7a7921199668ec70cc99775d70275cb49584d84e987bf554ec958c75"} Oct 05 20:35:11 crc kubenswrapper[4753]: I1005 20:35:11.038343 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:11 crc kubenswrapper[4753]: I1005 20:35:11.068317 4753 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" podStartSLOduration=4.068298147 podStartE2EDuration="4.068298147s" podCreationTimestamp="2025-10-05 20:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:35:11.066415718 +0000 UTC m=+1219.914743960" watchObservedRunningTime="2025-10-05 20:35:11.068298147 +0000 UTC m=+1219.916626399" Oct 05 20:35:12 crc kubenswrapper[4753]: E1005 20:35:12.152194 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d182e9_8e4b_46ce_aa0b_6fd751eefecd.slice/crio-f10e1725cc723c29ef4bd7b39b13680e39522e023e3435dbc375a368b642b94c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d182e9_8e4b_46ce_aa0b_6fd751eefecd.slice\": RecentStats: unable to find data in memory cache]" Oct 05 20:35:18 crc kubenswrapper[4753]: I1005 20:35:18.277214 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 20:35:18 crc kubenswrapper[4753]: I1005 20:35:18.354497 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c56997c9f-cs94f"] Oct 05 20:35:18 crc kubenswrapper[4753]: I1005 20:35:18.354818 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" podUID="6771969b-9209-46ec-b2a3-f425675c4090" containerName="dnsmasq-dns" containerID="cri-o://e6fd4068300439a00111c32773c5a0dd34d90b3965f22da8884038dbca88664f" gracePeriod=10 Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.150256 4753 generic.go:334] "Generic (PLEG): container finished" podID="6771969b-9209-46ec-b2a3-f425675c4090" containerID="e6fd4068300439a00111c32773c5a0dd34d90b3965f22da8884038dbca88664f" exitCode=0 
Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.150454 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" event={"ID":"6771969b-9209-46ec-b2a3-f425675c4090","Type":"ContainerDied","Data":"e6fd4068300439a00111c32773c5a0dd34d90b3965f22da8884038dbca88664f"} Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.358325 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.506159 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-dns-svc\") pod \"6771969b-9209-46ec-b2a3-f425675c4090\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.506295 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-config\") pod \"6771969b-9209-46ec-b2a3-f425675c4090\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.506381 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-ovsdbserver-sb\") pod \"6771969b-9209-46ec-b2a3-f425675c4090\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.506417 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-openstack-edpm-ipam\") pod \"6771969b-9209-46ec-b2a3-f425675c4090\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.506481 4753 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-ncr56\" (UniqueName: \"kubernetes.io/projected/6771969b-9209-46ec-b2a3-f425675c4090-kube-api-access-ncr56\") pod \"6771969b-9209-46ec-b2a3-f425675c4090\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.506520 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-ovsdbserver-nb\") pod \"6771969b-9209-46ec-b2a3-f425675c4090\" (UID: \"6771969b-9209-46ec-b2a3-f425675c4090\") " Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.513399 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6771969b-9209-46ec-b2a3-f425675c4090-kube-api-access-ncr56" (OuterVolumeSpecName: "kube-api-access-ncr56") pod "6771969b-9209-46ec-b2a3-f425675c4090" (UID: "6771969b-9209-46ec-b2a3-f425675c4090"). InnerVolumeSpecName "kube-api-access-ncr56". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.559130 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-config" (OuterVolumeSpecName: "config") pod "6771969b-9209-46ec-b2a3-f425675c4090" (UID: "6771969b-9209-46ec-b2a3-f425675c4090"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.569450 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6771969b-9209-46ec-b2a3-f425675c4090" (UID: "6771969b-9209-46ec-b2a3-f425675c4090"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.571686 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6771969b-9209-46ec-b2a3-f425675c4090" (UID: "6771969b-9209-46ec-b2a3-f425675c4090"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.578477 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6771969b-9209-46ec-b2a3-f425675c4090" (UID: "6771969b-9209-46ec-b2a3-f425675c4090"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.595605 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "6771969b-9209-46ec-b2a3-f425675c4090" (UID: "6771969b-9209-46ec-b2a3-f425675c4090"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.608638 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncr56\" (UniqueName: \"kubernetes.io/projected/6771969b-9209-46ec-b2a3-f425675c4090-kube-api-access-ncr56\") on node \"crc\" DevicePath \"\"" Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.608680 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.608690 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.608699 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-config\") on node \"crc\" DevicePath \"\"" Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.608707 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 05 20:35:19 crc kubenswrapper[4753]: I1005 20:35:19.608716 4753 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6771969b-9209-46ec-b2a3-f425675c4090-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 05 20:35:20 crc kubenswrapper[4753]: I1005 20:35:20.161921 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" event={"ID":"6771969b-9209-46ec-b2a3-f425675c4090","Type":"ContainerDied","Data":"d285edc34abc0ad7d4c3d4d5f70824f085ceebc424b923d9b5e12695acf012fe"} Oct 05 20:35:20 crc 
kubenswrapper[4753]: I1005 20:35:20.161981 4753 scope.go:117] "RemoveContainer" containerID="e6fd4068300439a00111c32773c5a0dd34d90b3965f22da8884038dbca88664f" Oct 05 20:35:20 crc kubenswrapper[4753]: I1005 20:35:20.162009 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c56997c9f-cs94f" Oct 05 20:35:20 crc kubenswrapper[4753]: I1005 20:35:20.188833 4753 scope.go:117] "RemoveContainer" containerID="5b4af5940a17d44043f44a8e89778b6148e39a54b2a22299b99da3d0603592a7" Oct 05 20:35:20 crc kubenswrapper[4753]: I1005 20:35:20.189232 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c56997c9f-cs94f"] Oct 05 20:35:20 crc kubenswrapper[4753]: I1005 20:35:20.197124 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c56997c9f-cs94f"] Oct 05 20:35:21 crc kubenswrapper[4753]: I1005 20:35:21.862977 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6771969b-9209-46ec-b2a3-f425675c4090" path="/var/lib/kubelet/pods/6771969b-9209-46ec-b2a3-f425675c4090/volumes" Oct 05 20:35:22 crc kubenswrapper[4753]: E1005 20:35:22.409645 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d182e9_8e4b_46ce_aa0b_6fd751eefecd.slice/crio-f10e1725cc723c29ef4bd7b39b13680e39522e023e3435dbc375a368b642b94c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d182e9_8e4b_46ce_aa0b_6fd751eefecd.slice\": RecentStats: unable to find data in memory cache]" Oct 05 20:35:27 crc kubenswrapper[4753]: I1005 20:35:27.254422 4753 generic.go:334] "Generic (PLEG): container finished" podID="468c6dc5-e196-4084-9211-d2b06253832d" containerID="60c353846de38c0b4d78289265291a4929092f89cd0c75e61cb0430aff4cd01f" exitCode=0 Oct 05 20:35:27 crc kubenswrapper[4753]: I1005 20:35:27.254811 4753 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"468c6dc5-e196-4084-9211-d2b06253832d","Type":"ContainerDied","Data":"60c353846de38c0b4d78289265291a4929092f89cd0c75e61cb0430aff4cd01f"} Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.266481 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"468c6dc5-e196-4084-9211-d2b06253832d","Type":"ContainerStarted","Data":"091faeae6ae122ca064d2294b05550c6da701127914f1b7409c3860411343a8f"} Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.267094 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.269645 4753 generic.go:334] "Generic (PLEG): container finished" podID="0022b5ba-c84b-4ee1-84a5-8e04d7c4d330" containerID="3c7649af84de79b942173a24bf374238f8a2607e4e23c27574a4294979517ba9" exitCode=0 Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.269713 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330","Type":"ContainerDied","Data":"3c7649af84de79b942173a24bf374238f8a2607e4e23c27574a4294979517ba9"} Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.304018 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.303414006 podStartE2EDuration="37.303414006s" podCreationTimestamp="2025-10-05 20:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:35:28.2903426 +0000 UTC m=+1237.138670832" watchObservedRunningTime="2025-10-05 20:35:28.303414006 +0000 UTC m=+1237.151742238" Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.522454 4753 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"] Oct 05 20:35:28 crc kubenswrapper[4753]: E1005 20:35:28.522890 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b6d8549-ef82-43f6-bc04-97565906e391" containerName="init" Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.522908 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6d8549-ef82-43f6-bc04-97565906e391" containerName="init" Oct 05 20:35:28 crc kubenswrapper[4753]: E1005 20:35:28.522932 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6771969b-9209-46ec-b2a3-f425675c4090" containerName="dnsmasq-dns" Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.522941 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6771969b-9209-46ec-b2a3-f425675c4090" containerName="dnsmasq-dns" Oct 05 20:35:28 crc kubenswrapper[4753]: E1005 20:35:28.522959 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6771969b-9209-46ec-b2a3-f425675c4090" containerName="init" Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.522967 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6771969b-9209-46ec-b2a3-f425675c4090" containerName="init" Oct 05 20:35:28 crc kubenswrapper[4753]: E1005 20:35:28.522992 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b6d8549-ef82-43f6-bc04-97565906e391" containerName="dnsmasq-dns" Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.523000 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6d8549-ef82-43f6-bc04-97565906e391" containerName="dnsmasq-dns" Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.523300 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b6d8549-ef82-43f6-bc04-97565906e391" containerName="dnsmasq-dns" Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.523380 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6771969b-9209-46ec-b2a3-f425675c4090" containerName="dnsmasq-dns" 
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.524055 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.529600 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.529764 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.529879 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.529992 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.541103 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"]
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.680159 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.680493 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbjnn\" (UniqueName: \"kubernetes.io/projected/e9296ffc-298c-49ec-a282-e8906b12ef70-kube-api-access-lbjnn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.680907 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.680946 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.782442 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbjnn\" (UniqueName: \"kubernetes.io/projected/e9296ffc-298c-49ec-a282-e8906b12ef70-kube-api-access-lbjnn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.782896 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.782985 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.783100 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.788377 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.788521 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.790866 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.802799 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbjnn\" (UniqueName: \"kubernetes.io/projected/e9296ffc-298c-49ec-a282-e8906b12ef70-kube-api-access-lbjnn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:28 crc kubenswrapper[4753]: I1005 20:35:28.865046 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:29 crc kubenswrapper[4753]: I1005 20:35:29.279238 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0022b5ba-c84b-4ee1-84a5-8e04d7c4d330","Type":"ContainerStarted","Data":"3a55bdd2da7c1b9d7e52848caa780188d4802606047bd47abeea4e36ce2e992a"}
Oct 05 20:35:29 crc kubenswrapper[4753]: I1005 20:35:29.280190 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Oct 05 20:35:29 crc kubenswrapper[4753]: I1005 20:35:29.307731 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.307702262 podStartE2EDuration="37.307702262s" podCreationTimestamp="2025-10-05 20:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 20:35:29.306273367 +0000 UTC m=+1238.154601599" watchObservedRunningTime="2025-10-05 20:35:29.307702262 +0000 UTC m=+1238.156030494"
Oct 05 20:35:29 crc kubenswrapper[4753]: I1005 20:35:29.440172 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"]
Oct 05 20:35:29 crc kubenswrapper[4753]: I1005 20:35:29.467091 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 05 20:35:30 crc kubenswrapper[4753]: I1005 20:35:30.285865 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24" event={"ID":"e9296ffc-298c-49ec-a282-e8906b12ef70","Type":"ContainerStarted","Data":"20272bf2da0a006b9516135281db9067d4bb9ad95aa8f78f0294124dbce2026a"}
Oct 05 20:35:32 crc kubenswrapper[4753]: E1005 20:35:32.645677 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d182e9_8e4b_46ce_aa0b_6fd751eefecd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d182e9_8e4b_46ce_aa0b_6fd751eefecd.slice/crio-f10e1725cc723c29ef4bd7b39b13680e39522e023e3435dbc375a368b642b94c\": RecentStats: unable to find data in memory cache]"
Oct 05 20:35:40 crc kubenswrapper[4753]: I1005 20:35:40.401186 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24" event={"ID":"e9296ffc-298c-49ec-a282-e8906b12ef70","Type":"ContainerStarted","Data":"6c6b8c27a07834f62a910221ea8ffd3da97777898b0a6bd915fb5d2efffd5d68"}
Oct 05 20:35:42 crc kubenswrapper[4753]: I1005 20:35:42.371376 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Oct 05 20:35:42 crc kubenswrapper[4753]: I1005 20:35:42.395868 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24" podStartSLOduration=3.683306997 podStartE2EDuration="14.395848279s" podCreationTimestamp="2025-10-05 20:35:28 +0000 UTC" firstStartedPulling="2025-10-05 20:35:29.466901469 +0000 UTC m=+1238.315229701" lastFinishedPulling="2025-10-05 20:35:40.179442751 +0000 UTC m=+1249.027770983" observedRunningTime="2025-10-05 20:35:40.420416995 +0000 UTC m=+1249.268745247" watchObservedRunningTime="2025-10-05 20:35:42.395848279 +0000 UTC m=+1251.244176511"
Oct 05 20:35:42 crc kubenswrapper[4753]: E1005 20:35:42.893922 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d182e9_8e4b_46ce_aa0b_6fd751eefecd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d182e9_8e4b_46ce_aa0b_6fd751eefecd.slice/crio-f10e1725cc723c29ef4bd7b39b13680e39522e023e3435dbc375a368b642b94c\": RecentStats: unable to find data in memory cache]"
Oct 05 20:35:43 crc kubenswrapper[4753]: I1005 20:35:43.223355 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Oct 05 20:35:52 crc kubenswrapper[4753]: I1005 20:35:52.500615 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24" event={"ID":"e9296ffc-298c-49ec-a282-e8906b12ef70","Type":"ContainerDied","Data":"6c6b8c27a07834f62a910221ea8ffd3da97777898b0a6bd915fb5d2efffd5d68"}
Oct 05 20:35:52 crc kubenswrapper[4753]: I1005 20:35:52.500562 4753 generic.go:334] "Generic (PLEG): container finished" podID="e9296ffc-298c-49ec-a282-e8906b12ef70" containerID="6c6b8c27a07834f62a910221ea8ffd3da97777898b0a6bd915fb5d2efffd5d68" exitCode=0
Oct 05 20:35:53 crc kubenswrapper[4753]: I1005 20:35:53.896523 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:53 crc kubenswrapper[4753]: I1005 20:35:53.966085 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbjnn\" (UniqueName: \"kubernetes.io/projected/e9296ffc-298c-49ec-a282-e8906b12ef70-kube-api-access-lbjnn\") pod \"e9296ffc-298c-49ec-a282-e8906b12ef70\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") "
Oct 05 20:35:53 crc kubenswrapper[4753]: I1005 20:35:53.966196 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-inventory\") pod \"e9296ffc-298c-49ec-a282-e8906b12ef70\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") "
Oct 05 20:35:53 crc kubenswrapper[4753]: I1005 20:35:53.966219 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-repo-setup-combined-ca-bundle\") pod \"e9296ffc-298c-49ec-a282-e8906b12ef70\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") "
Oct 05 20:35:53 crc kubenswrapper[4753]: I1005 20:35:53.966269 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-ssh-key\") pod \"e9296ffc-298c-49ec-a282-e8906b12ef70\" (UID: \"e9296ffc-298c-49ec-a282-e8906b12ef70\") "
Oct 05 20:35:53 crc kubenswrapper[4753]: I1005 20:35:53.987373 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e9296ffc-298c-49ec-a282-e8906b12ef70" (UID: "e9296ffc-298c-49ec-a282-e8906b12ef70"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:35:53 crc kubenswrapper[4753]: I1005 20:35:53.992774 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9296ffc-298c-49ec-a282-e8906b12ef70-kube-api-access-lbjnn" (OuterVolumeSpecName: "kube-api-access-lbjnn") pod "e9296ffc-298c-49ec-a282-e8906b12ef70" (UID: "e9296ffc-298c-49ec-a282-e8906b12ef70"). InnerVolumeSpecName "kube-api-access-lbjnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.007231 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e9296ffc-298c-49ec-a282-e8906b12ef70" (UID: "e9296ffc-298c-49ec-a282-e8906b12ef70"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.013912 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-inventory" (OuterVolumeSpecName: "inventory") pod "e9296ffc-298c-49ec-a282-e8906b12ef70" (UID: "e9296ffc-298c-49ec-a282-e8906b12ef70"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.068467 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-inventory\") on node \"crc\" DevicePath \"\""
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.068504 4753 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.068520 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9296ffc-298c-49ec-a282-e8906b12ef70-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.068531 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbjnn\" (UniqueName: \"kubernetes.io/projected/e9296ffc-298c-49ec-a282-e8906b12ef70-kube-api-access-lbjnn\") on node \"crc\" DevicePath \"\""
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.523364 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24" event={"ID":"e9296ffc-298c-49ec-a282-e8906b12ef70","Type":"ContainerDied","Data":"20272bf2da0a006b9516135281db9067d4bb9ad95aa8f78f0294124dbce2026a"}
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.523404 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20272bf2da0a006b9516135281db9067d4bb9ad95aa8f78f0294124dbce2026a"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.523408 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.599754 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"]
Oct 05 20:35:54 crc kubenswrapper[4753]: E1005 20:35:54.600098 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9296ffc-298c-49ec-a282-e8906b12ef70" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.600114 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9296ffc-298c-49ec-a282-e8906b12ef70" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.600328 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9296ffc-298c-49ec-a282-e8906b12ef70" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.601055 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.602352 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.602740 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.603677 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.603957 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.611202 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"]
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.677195 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.677653 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8cfj\" (UniqueName: \"kubernetes.io/projected/03feab36-54ee-4968-a9d9-841cbd059c48-kube-api-access-r8cfj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.677707 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.677736 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.779292 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.779409 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8cfj\" (UniqueName: \"kubernetes.io/projected/03feab36-54ee-4968-a9d9-841cbd059c48-kube-api-access-r8cfj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.779456 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.779482 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.785525 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.786018 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.786643 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.797990 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8cfj\" (UniqueName: \"kubernetes.io/projected/03feab36-54ee-4968-a9d9-841cbd059c48-kube-api-access-r8cfj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"
Oct 05 20:35:54 crc kubenswrapper[4753]: I1005 20:35:54.921053 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"
Oct 05 20:35:55 crc kubenswrapper[4753]: W1005 20:35:55.412573 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03feab36_54ee_4968_a9d9_841cbd059c48.slice/crio-2d9e090eb20ff56e52c73e28a142ff8e77d22d12088c2ff92ff74c649a600a15 WatchSource:0}: Error finding container 2d9e090eb20ff56e52c73e28a142ff8e77d22d12088c2ff92ff74c649a600a15: Status 404 returned error can't find the container with id 2d9e090eb20ff56e52c73e28a142ff8e77d22d12088c2ff92ff74c649a600a15
Oct 05 20:35:55 crc kubenswrapper[4753]: I1005 20:35:55.424626 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"]
Oct 05 20:35:55 crc kubenswrapper[4753]: I1005 20:35:55.533608 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf" event={"ID":"03feab36-54ee-4968-a9d9-841cbd059c48","Type":"ContainerStarted","Data":"2d9e090eb20ff56e52c73e28a142ff8e77d22d12088c2ff92ff74c649a600a15"}
Oct 05 20:35:56 crc kubenswrapper[4753]: I1005 20:35:56.542844 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf" event={"ID":"03feab36-54ee-4968-a9d9-841cbd059c48","Type":"ContainerStarted","Data":"99084cf66eb920dccd49c79c0cced862601fb3880ba396b1d3cc4ecc013c55c7"}
Oct 05 20:35:56 crc kubenswrapper[4753]: I1005 20:35:56.566582 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf" podStartSLOduration=2.168371951 podStartE2EDuration="2.56655841s" podCreationTimestamp="2025-10-05 20:35:54 +0000 UTC" firstStartedPulling="2025-10-05 20:35:55.417532105 +0000 UTC m=+1264.265860337" lastFinishedPulling="2025-10-05 20:35:55.815718534 +0000 UTC m=+1264.664046796" observedRunningTime="2025-10-05 20:35:56.555725534 +0000 UTC m=+1265.404053776" watchObservedRunningTime="2025-10-05 20:35:56.56655841 +0000 UTC m=+1265.414886662"
Oct 05 20:37:01 crc kubenswrapper[4753]: I1005 20:37:01.501629 4753 scope.go:117] "RemoveContainer" containerID="b6cb35863e1bfd69e973067627d866681726e82ae091c618cd255b4a866380e2"
Oct 05 20:37:01 crc kubenswrapper[4753]: I1005 20:37:01.555495 4753 scope.go:117] "RemoveContainer" containerID="ffa3c0b3f8d357022658924fbde7c96b5af854dff028dded3bb2c558e162c90c"
Oct 05 20:37:04 crc kubenswrapper[4753]: I1005 20:37:04.490654 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 05 20:37:04 crc kubenswrapper[4753]: I1005 20:37:04.491042 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 05 20:37:34 crc kubenswrapper[4753]: I1005 20:37:34.490213 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 05 20:37:34 crc kubenswrapper[4753]: I1005 20:37:34.490999 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 05 20:37:52 crc kubenswrapper[4753]: I1005 20:37:52.879328 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q2jnt"]
Oct 05 20:37:52 crc kubenswrapper[4753]: I1005 20:37:52.883976 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2jnt"
Oct 05 20:37:52 crc kubenswrapper[4753]: I1005 20:37:52.904764 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2jnt"]
Oct 05 20:37:52 crc kubenswrapper[4753]: I1005 20:37:52.937809 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-catalog-content\") pod \"redhat-marketplace-q2jnt\" (UID: \"692e13de-cdf9-45c0-bdb6-f94485ca4d5d\") " pod="openshift-marketplace/redhat-marketplace-q2jnt"
Oct 05 20:37:52 crc kubenswrapper[4753]: I1005 20:37:52.938000 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v4fh\" (UniqueName: \"kubernetes.io/projected/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-kube-api-access-4v4fh\") pod \"redhat-marketplace-q2jnt\" (UID: \"692e13de-cdf9-45c0-bdb6-f94485ca4d5d\") " pod="openshift-marketplace/redhat-marketplace-q2jnt"
Oct 05 20:37:52 crc kubenswrapper[4753]: I1005 20:37:52.938131 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-utilities\") pod \"redhat-marketplace-q2jnt\" (UID: \"692e13de-cdf9-45c0-bdb6-f94485ca4d5d\") " pod="openshift-marketplace/redhat-marketplace-q2jnt"
Oct 05 20:37:53 crc kubenswrapper[4753]: I1005 20:37:53.040186 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v4fh\" (UniqueName: \"kubernetes.io/projected/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-kube-api-access-4v4fh\") pod \"redhat-marketplace-q2jnt\" (UID: \"692e13de-cdf9-45c0-bdb6-f94485ca4d5d\") " pod="openshift-marketplace/redhat-marketplace-q2jnt"
Oct 05 20:37:53 crc kubenswrapper[4753]: I1005 20:37:53.040370 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-utilities\") pod \"redhat-marketplace-q2jnt\" (UID: \"692e13de-cdf9-45c0-bdb6-f94485ca4d5d\") " pod="openshift-marketplace/redhat-marketplace-q2jnt"
Oct 05 20:37:53 crc kubenswrapper[4753]: I1005 20:37:53.040440 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-catalog-content\") pod \"redhat-marketplace-q2jnt\" (UID: \"692e13de-cdf9-45c0-bdb6-f94485ca4d5d\") " pod="openshift-marketplace/redhat-marketplace-q2jnt"
Oct 05 20:37:53 crc kubenswrapper[4753]: I1005 20:37:53.041263 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-catalog-content\") pod \"redhat-marketplace-q2jnt\" (UID: \"692e13de-cdf9-45c0-bdb6-f94485ca4d5d\") " pod="openshift-marketplace/redhat-marketplace-q2jnt"
Oct 05 20:37:53 crc kubenswrapper[4753]: I1005 20:37:53.043404 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-utilities\") pod \"redhat-marketplace-q2jnt\" (UID: \"692e13de-cdf9-45c0-bdb6-f94485ca4d5d\") " pod="openshift-marketplace/redhat-marketplace-q2jnt"
Oct 05 20:37:53 crc kubenswrapper[4753]: I1005 20:37:53.066434 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v4fh\" (UniqueName: \"kubernetes.io/projected/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-kube-api-access-4v4fh\") pod \"redhat-marketplace-q2jnt\" (UID: \"692e13de-cdf9-45c0-bdb6-f94485ca4d5d\") " pod="openshift-marketplace/redhat-marketplace-q2jnt"
Oct 05 20:37:53 crc kubenswrapper[4753]: I1005 20:37:53.206257 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2jnt"
Oct 05 20:37:53 crc kubenswrapper[4753]: I1005 20:37:53.674335 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2jnt"]
Oct 05 20:37:54 crc kubenswrapper[4753]: I1005 20:37:54.648426 4753 generic.go:334] "Generic (PLEG): container finished" podID="692e13de-cdf9-45c0-bdb6-f94485ca4d5d" containerID="c6a5879ade2d725730afe26b6203ccdf077ebbf6e86661c9187fb0c11b4c4585" exitCode=0
Oct 05 20:37:54 crc kubenswrapper[4753]: I1005 20:37:54.648514 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2jnt" event={"ID":"692e13de-cdf9-45c0-bdb6-f94485ca4d5d","Type":"ContainerDied","Data":"c6a5879ade2d725730afe26b6203ccdf077ebbf6e86661c9187fb0c11b4c4585"}
Oct 05 20:37:54 crc kubenswrapper[4753]: I1005 20:37:54.649727 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2jnt" event={"ID":"692e13de-cdf9-45c0-bdb6-f94485ca4d5d","Type":"ContainerStarted","Data":"067a40eacb014a5e975beccaf95fd4786e53776a130793a1c1add4028fc25537"}
Oct 05 20:37:55 crc kubenswrapper[4753]: I1005 20:37:55.659790 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2jnt" event={"ID":"692e13de-cdf9-45c0-bdb6-f94485ca4d5d","Type":"ContainerStarted","Data":"1279a50e11d467ff1f603e5c0401fc3751b7dcf5254bbf28c33fed0ce3947d82"}
Oct 05 20:37:56 crc kubenswrapper[4753]: I1005 20:37:56.674659 4753 generic.go:334] "Generic (PLEG): container finished" podID="692e13de-cdf9-45c0-bdb6-f94485ca4d5d" containerID="1279a50e11d467ff1f603e5c0401fc3751b7dcf5254bbf28c33fed0ce3947d82" exitCode=0
Oct 05 20:37:56 crc kubenswrapper[4753]: I1005 20:37:56.674806 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2jnt" event={"ID":"692e13de-cdf9-45c0-bdb6-f94485ca4d5d","Type":"ContainerDied","Data":"1279a50e11d467ff1f603e5c0401fc3751b7dcf5254bbf28c33fed0ce3947d82"}
Oct 05 20:37:57 crc kubenswrapper[4753]: I1005 20:37:57.691196 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2jnt" event={"ID":"692e13de-cdf9-45c0-bdb6-f94485ca4d5d","Type":"ContainerStarted","Data":"730f0595e38d0ca113fa21fc02cccfbc2eb2bc2c4b15a62a548155ed01f0e0d9"}
Oct 05 20:37:57 crc kubenswrapper[4753]: I1005 20:37:57.725424 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q2jnt" podStartSLOduration=3.112982729 podStartE2EDuration="5.725401038s" podCreationTimestamp="2025-10-05 20:37:52 +0000 UTC" firstStartedPulling="2025-10-05 20:37:54.650243898 +0000 UTC m=+1383.498572130" lastFinishedPulling="2025-10-05 20:37:57.262662207 +0000 UTC m=+1386.110990439" observedRunningTime="2025-10-05 20:37:57.713299562 +0000 UTC m=+1386.561627794" watchObservedRunningTime="2025-10-05 20:37:57.725401038 +0000 UTC m=+1386.573729300"
Oct 05 20:38:01 crc kubenswrapper[4753]: I1005 20:38:01.631690 4753 scope.go:117] "RemoveContainer" containerID="6e63058d2cb40f0e763d5fab64e729403984404fc3d63a216d05106ab19053ac"
Oct 05 20:38:03 crc kubenswrapper[4753]: I1005 20:38:03.206994 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q2jnt"
Oct 05 20:38:03 crc kubenswrapper[4753]: I1005 20:38:03.207447 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q2jnt"
Oct 05 20:38:03 crc kubenswrapper[4753]: I1005 20:38:03.262368 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q2jnt"
Oct 05 20:38:03 crc kubenswrapper[4753]: I1005 20:38:03.812402 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q2jnt"
Oct 05 20:38:03 crc kubenswrapper[4753]: I1005 20:38:03.875498 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2jnt"]
Oct 05 20:38:04 crc kubenswrapper[4753]: I1005 20:38:04.490786 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 05 20:38:04 crc kubenswrapper[4753]: I1005 20:38:04.491851 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 05 20:38:04 crc kubenswrapper[4753]: I1005 20:38:04.492007 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd"
Oct 05 20:38:04 crc kubenswrapper[4753]: I1005 20:38:04.492945 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9030d6a687aa30375abc82b21b6ddaee627312bc9d654f1ab51991af41da5fd6"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 05 20:38:04 crc kubenswrapper[4753]: I1005 20:38:04.493208 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" containerID="cri-o://9030d6a687aa30375abc82b21b6ddaee627312bc9d654f1ab51991af41da5fd6" gracePeriod=600
Oct 05 20:38:04 crc kubenswrapper[4753]: I1005 20:38:04.764557 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="9030d6a687aa30375abc82b21b6ddaee627312bc9d654f1ab51991af41da5fd6" exitCode=0
Oct 05 20:38:04 crc kubenswrapper[4753]: I1005 20:38:04.764632 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"9030d6a687aa30375abc82b21b6ddaee627312bc9d654f1ab51991af41da5fd6"}
Oct 05 20:38:04 crc kubenswrapper[4753]: I1005 20:38:04.764694 4753 scope.go:117] "RemoveContainer" containerID="6035d1f392035c4d85f80611de9c6434a6132b34d7431e03335a2036e9d1df2f"
Oct 05 20:38:05 crc kubenswrapper[4753]: I1005 20:38:05.783995 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a"}
Oct 05 20:38:05 crc kubenswrapper[4753]: I1005 20:38:05.784314 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q2jnt"
podUID="692e13de-cdf9-45c0-bdb6-f94485ca4d5d" containerName="registry-server" containerID="cri-o://730f0595e38d0ca113fa21fc02cccfbc2eb2bc2c4b15a62a548155ed01f0e0d9" gracePeriod=2 Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.301210 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2jnt" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.441397 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v4fh\" (UniqueName: \"kubernetes.io/projected/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-kube-api-access-4v4fh\") pod \"692e13de-cdf9-45c0-bdb6-f94485ca4d5d\" (UID: \"692e13de-cdf9-45c0-bdb6-f94485ca4d5d\") " Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.441499 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-utilities\") pod \"692e13de-cdf9-45c0-bdb6-f94485ca4d5d\" (UID: \"692e13de-cdf9-45c0-bdb6-f94485ca4d5d\") " Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.441540 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-catalog-content\") pod \"692e13de-cdf9-45c0-bdb6-f94485ca4d5d\" (UID: \"692e13de-cdf9-45c0-bdb6-f94485ca4d5d\") " Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.442550 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-utilities" (OuterVolumeSpecName: "utilities") pod "692e13de-cdf9-45c0-bdb6-f94485ca4d5d" (UID: "692e13de-cdf9-45c0-bdb6-f94485ca4d5d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.447960 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-kube-api-access-4v4fh" (OuterVolumeSpecName: "kube-api-access-4v4fh") pod "692e13de-cdf9-45c0-bdb6-f94485ca4d5d" (UID: "692e13de-cdf9-45c0-bdb6-f94485ca4d5d"). InnerVolumeSpecName "kube-api-access-4v4fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.457829 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "692e13de-cdf9-45c0-bdb6-f94485ca4d5d" (UID: "692e13de-cdf9-45c0-bdb6-f94485ca4d5d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.543490 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.543880 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.543899 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v4fh\" (UniqueName: \"kubernetes.io/projected/692e13de-cdf9-45c0-bdb6-f94485ca4d5d-kube-api-access-4v4fh\") on node \"crc\" DevicePath \"\"" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.801314 4753 generic.go:334] "Generic (PLEG): container finished" podID="692e13de-cdf9-45c0-bdb6-f94485ca4d5d" 
containerID="730f0595e38d0ca113fa21fc02cccfbc2eb2bc2c4b15a62a548155ed01f0e0d9" exitCode=0 Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.801366 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2jnt" event={"ID":"692e13de-cdf9-45c0-bdb6-f94485ca4d5d","Type":"ContainerDied","Data":"730f0595e38d0ca113fa21fc02cccfbc2eb2bc2c4b15a62a548155ed01f0e0d9"} Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.801418 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2jnt" event={"ID":"692e13de-cdf9-45c0-bdb6-f94485ca4d5d","Type":"ContainerDied","Data":"067a40eacb014a5e975beccaf95fd4786e53776a130793a1c1add4028fc25537"} Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.801435 4753 scope.go:117] "RemoveContainer" containerID="730f0595e38d0ca113fa21fc02cccfbc2eb2bc2c4b15a62a548155ed01f0e0d9" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.801449 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2jnt" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.835358 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2jnt"] Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.844247 4753 scope.go:117] "RemoveContainer" containerID="1279a50e11d467ff1f603e5c0401fc3751b7dcf5254bbf28c33fed0ce3947d82" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.848390 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2jnt"] Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.882971 4753 scope.go:117] "RemoveContainer" containerID="c6a5879ade2d725730afe26b6203ccdf077ebbf6e86661c9187fb0c11b4c4585" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.909223 4753 scope.go:117] "RemoveContainer" containerID="730f0595e38d0ca113fa21fc02cccfbc2eb2bc2c4b15a62a548155ed01f0e0d9" Oct 05 20:38:06 crc kubenswrapper[4753]: E1005 20:38:06.909956 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"730f0595e38d0ca113fa21fc02cccfbc2eb2bc2c4b15a62a548155ed01f0e0d9\": container with ID starting with 730f0595e38d0ca113fa21fc02cccfbc2eb2bc2c4b15a62a548155ed01f0e0d9 not found: ID does not exist" containerID="730f0595e38d0ca113fa21fc02cccfbc2eb2bc2c4b15a62a548155ed01f0e0d9" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.909986 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"730f0595e38d0ca113fa21fc02cccfbc2eb2bc2c4b15a62a548155ed01f0e0d9"} err="failed to get container status \"730f0595e38d0ca113fa21fc02cccfbc2eb2bc2c4b15a62a548155ed01f0e0d9\": rpc error: code = NotFound desc = could not find container \"730f0595e38d0ca113fa21fc02cccfbc2eb2bc2c4b15a62a548155ed01f0e0d9\": container with ID starting with 730f0595e38d0ca113fa21fc02cccfbc2eb2bc2c4b15a62a548155ed01f0e0d9 not found: 
ID does not exist" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.910026 4753 scope.go:117] "RemoveContainer" containerID="1279a50e11d467ff1f603e5c0401fc3751b7dcf5254bbf28c33fed0ce3947d82" Oct 05 20:38:06 crc kubenswrapper[4753]: E1005 20:38:06.910556 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1279a50e11d467ff1f603e5c0401fc3751b7dcf5254bbf28c33fed0ce3947d82\": container with ID starting with 1279a50e11d467ff1f603e5c0401fc3751b7dcf5254bbf28c33fed0ce3947d82 not found: ID does not exist" containerID="1279a50e11d467ff1f603e5c0401fc3751b7dcf5254bbf28c33fed0ce3947d82" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.910602 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1279a50e11d467ff1f603e5c0401fc3751b7dcf5254bbf28c33fed0ce3947d82"} err="failed to get container status \"1279a50e11d467ff1f603e5c0401fc3751b7dcf5254bbf28c33fed0ce3947d82\": rpc error: code = NotFound desc = could not find container \"1279a50e11d467ff1f603e5c0401fc3751b7dcf5254bbf28c33fed0ce3947d82\": container with ID starting with 1279a50e11d467ff1f603e5c0401fc3751b7dcf5254bbf28c33fed0ce3947d82 not found: ID does not exist" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.910633 4753 scope.go:117] "RemoveContainer" containerID="c6a5879ade2d725730afe26b6203ccdf077ebbf6e86661c9187fb0c11b4c4585" Oct 05 20:38:06 crc kubenswrapper[4753]: E1005 20:38:06.910993 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6a5879ade2d725730afe26b6203ccdf077ebbf6e86661c9187fb0c11b4c4585\": container with ID starting with c6a5879ade2d725730afe26b6203ccdf077ebbf6e86661c9187fb0c11b4c4585 not found: ID does not exist" containerID="c6a5879ade2d725730afe26b6203ccdf077ebbf6e86661c9187fb0c11b4c4585" Oct 05 20:38:06 crc kubenswrapper[4753]: I1005 20:38:06.911019 4753 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6a5879ade2d725730afe26b6203ccdf077ebbf6e86661c9187fb0c11b4c4585"} err="failed to get container status \"c6a5879ade2d725730afe26b6203ccdf077ebbf6e86661c9187fb0c11b4c4585\": rpc error: code = NotFound desc = could not find container \"c6a5879ade2d725730afe26b6203ccdf077ebbf6e86661c9187fb0c11b4c4585\": container with ID starting with c6a5879ade2d725730afe26b6203ccdf077ebbf6e86661c9187fb0c11b4c4585 not found: ID does not exist" Oct 05 20:38:07 crc kubenswrapper[4753]: I1005 20:38:07.866397 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="692e13de-cdf9-45c0-bdb6-f94485ca4d5d" path="/var/lib/kubelet/pods/692e13de-cdf9-45c0-bdb6-f94485ca4d5d/volumes" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.271604 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5wcxf"] Oct 05 20:38:18 crc kubenswrapper[4753]: E1005 20:38:18.272491 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="692e13de-cdf9-45c0-bdb6-f94485ca4d5d" containerName="registry-server" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.272504 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="692e13de-cdf9-45c0-bdb6-f94485ca4d5d" containerName="registry-server" Oct 05 20:38:18 crc kubenswrapper[4753]: E1005 20:38:18.272518 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="692e13de-cdf9-45c0-bdb6-f94485ca4d5d" containerName="extract-content" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.272524 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="692e13de-cdf9-45c0-bdb6-f94485ca4d5d" containerName="extract-content" Oct 05 20:38:18 crc kubenswrapper[4753]: E1005 20:38:18.272544 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="692e13de-cdf9-45c0-bdb6-f94485ca4d5d" containerName="extract-utilities" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.272550 4753 
state_mem.go:107] "Deleted CPUSet assignment" podUID="692e13de-cdf9-45c0-bdb6-f94485ca4d5d" containerName="extract-utilities" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.272739 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="692e13de-cdf9-45c0-bdb6-f94485ca4d5d" containerName="registry-server" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.274001 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.291110 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5wcxf"] Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.440788 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88960820-2f68-49b2-aa6a-68776c24cb27-utilities\") pod \"certified-operators-5wcxf\" (UID: \"88960820-2f68-49b2-aa6a-68776c24cb27\") " pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.441442 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88960820-2f68-49b2-aa6a-68776c24cb27-catalog-content\") pod \"certified-operators-5wcxf\" (UID: \"88960820-2f68-49b2-aa6a-68776c24cb27\") " pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.441622 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p5dz\" (UniqueName: \"kubernetes.io/projected/88960820-2f68-49b2-aa6a-68776c24cb27-kube-api-access-4p5dz\") pod \"certified-operators-5wcxf\" (UID: \"88960820-2f68-49b2-aa6a-68776c24cb27\") " pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 
20:38:18.543062 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88960820-2f68-49b2-aa6a-68776c24cb27-catalog-content\") pod \"certified-operators-5wcxf\" (UID: \"88960820-2f68-49b2-aa6a-68776c24cb27\") " pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.543232 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p5dz\" (UniqueName: \"kubernetes.io/projected/88960820-2f68-49b2-aa6a-68776c24cb27-kube-api-access-4p5dz\") pod \"certified-operators-5wcxf\" (UID: \"88960820-2f68-49b2-aa6a-68776c24cb27\") " pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.543274 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88960820-2f68-49b2-aa6a-68776c24cb27-utilities\") pod \"certified-operators-5wcxf\" (UID: \"88960820-2f68-49b2-aa6a-68776c24cb27\") " pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.543600 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88960820-2f68-49b2-aa6a-68776c24cb27-catalog-content\") pod \"certified-operators-5wcxf\" (UID: \"88960820-2f68-49b2-aa6a-68776c24cb27\") " pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.543640 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88960820-2f68-49b2-aa6a-68776c24cb27-utilities\") pod \"certified-operators-5wcxf\" (UID: \"88960820-2f68-49b2-aa6a-68776c24cb27\") " pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.565809 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p5dz\" (UniqueName: \"kubernetes.io/projected/88960820-2f68-49b2-aa6a-68776c24cb27-kube-api-access-4p5dz\") pod \"certified-operators-5wcxf\" (UID: \"88960820-2f68-49b2-aa6a-68776c24cb27\") " pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:18 crc kubenswrapper[4753]: I1005 20:38:18.607409 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:19 crc kubenswrapper[4753]: I1005 20:38:19.100516 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5wcxf"] Oct 05 20:38:19 crc kubenswrapper[4753]: I1005 20:38:19.919226 4753 generic.go:334] "Generic (PLEG): container finished" podID="88960820-2f68-49b2-aa6a-68776c24cb27" containerID="aaacb92e279971845cb2ece06dec39974e42f334eade578c40bac890e7714cb2" exitCode=0 Oct 05 20:38:19 crc kubenswrapper[4753]: I1005 20:38:19.920234 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wcxf" event={"ID":"88960820-2f68-49b2-aa6a-68776c24cb27","Type":"ContainerDied","Data":"aaacb92e279971845cb2ece06dec39974e42f334eade578c40bac890e7714cb2"} Oct 05 20:38:19 crc kubenswrapper[4753]: I1005 20:38:19.920286 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wcxf" event={"ID":"88960820-2f68-49b2-aa6a-68776c24cb27","Type":"ContainerStarted","Data":"0854e962907b21628f01b91c4e4d60b9c44c8b787bb6d9a9432e1c382ca246ae"} Oct 05 20:38:21 crc kubenswrapper[4753]: I1005 20:38:21.936132 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wcxf" event={"ID":"88960820-2f68-49b2-aa6a-68776c24cb27","Type":"ContainerStarted","Data":"2c8438bd0f0847beb24ef6c267a68d2ea4ff37cdc86b8cf209d75d85b481599d"} Oct 05 20:38:22 crc kubenswrapper[4753]: I1005 20:38:22.946225 4753 
generic.go:334] "Generic (PLEG): container finished" podID="88960820-2f68-49b2-aa6a-68776c24cb27" containerID="2c8438bd0f0847beb24ef6c267a68d2ea4ff37cdc86b8cf209d75d85b481599d" exitCode=0 Oct 05 20:38:22 crc kubenswrapper[4753]: I1005 20:38:22.946275 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wcxf" event={"ID":"88960820-2f68-49b2-aa6a-68776c24cb27","Type":"ContainerDied","Data":"2c8438bd0f0847beb24ef6c267a68d2ea4ff37cdc86b8cf209d75d85b481599d"} Oct 05 20:38:23 crc kubenswrapper[4753]: I1005 20:38:23.957081 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wcxf" event={"ID":"88960820-2f68-49b2-aa6a-68776c24cb27","Type":"ContainerStarted","Data":"13308c942119d75692694959d64dd3f876bc494ddbc7022cbe249150bb55f154"} Oct 05 20:38:23 crc kubenswrapper[4753]: I1005 20:38:23.976315 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5wcxf" podStartSLOduration=2.524960855 podStartE2EDuration="5.976296603s" podCreationTimestamp="2025-10-05 20:38:18 +0000 UTC" firstStartedPulling="2025-10-05 20:38:19.922322016 +0000 UTC m=+1408.770650248" lastFinishedPulling="2025-10-05 20:38:23.373657764 +0000 UTC m=+1412.221985996" observedRunningTime="2025-10-05 20:38:23.970839134 +0000 UTC m=+1412.819167396" watchObservedRunningTime="2025-10-05 20:38:23.976296603 +0000 UTC m=+1412.824624825" Oct 05 20:38:28 crc kubenswrapper[4753]: I1005 20:38:28.608294 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:28 crc kubenswrapper[4753]: I1005 20:38:28.608906 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:28 crc kubenswrapper[4753]: I1005 20:38:28.660930 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:29 crc kubenswrapper[4753]: I1005 20:38:29.039105 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:29 crc kubenswrapper[4753]: I1005 20:38:29.088389 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5wcxf"] Oct 05 20:38:31 crc kubenswrapper[4753]: I1005 20:38:31.015377 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5wcxf" podUID="88960820-2f68-49b2-aa6a-68776c24cb27" containerName="registry-server" containerID="cri-o://13308c942119d75692694959d64dd3f876bc494ddbc7022cbe249150bb55f154" gracePeriod=2 Oct 05 20:38:31 crc kubenswrapper[4753]: I1005 20:38:31.504083 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:31 crc kubenswrapper[4753]: I1005 20:38:31.685430 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88960820-2f68-49b2-aa6a-68776c24cb27-catalog-content\") pod \"88960820-2f68-49b2-aa6a-68776c24cb27\" (UID: \"88960820-2f68-49b2-aa6a-68776c24cb27\") " Oct 05 20:38:31 crc kubenswrapper[4753]: I1005 20:38:31.685841 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88960820-2f68-49b2-aa6a-68776c24cb27-utilities\") pod \"88960820-2f68-49b2-aa6a-68776c24cb27\" (UID: \"88960820-2f68-49b2-aa6a-68776c24cb27\") " Oct 05 20:38:31 crc kubenswrapper[4753]: I1005 20:38:31.685908 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p5dz\" (UniqueName: \"kubernetes.io/projected/88960820-2f68-49b2-aa6a-68776c24cb27-kube-api-access-4p5dz\") pod 
\"88960820-2f68-49b2-aa6a-68776c24cb27\" (UID: \"88960820-2f68-49b2-aa6a-68776c24cb27\") " Oct 05 20:38:31 crc kubenswrapper[4753]: I1005 20:38:31.686416 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88960820-2f68-49b2-aa6a-68776c24cb27-utilities" (OuterVolumeSpecName: "utilities") pod "88960820-2f68-49b2-aa6a-68776c24cb27" (UID: "88960820-2f68-49b2-aa6a-68776c24cb27"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:38:31 crc kubenswrapper[4753]: I1005 20:38:31.702809 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88960820-2f68-49b2-aa6a-68776c24cb27-kube-api-access-4p5dz" (OuterVolumeSpecName: "kube-api-access-4p5dz") pod "88960820-2f68-49b2-aa6a-68776c24cb27" (UID: "88960820-2f68-49b2-aa6a-68776c24cb27"). InnerVolumeSpecName "kube-api-access-4p5dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:38:31 crc kubenswrapper[4753]: I1005 20:38:31.732547 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88960820-2f68-49b2-aa6a-68776c24cb27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88960820-2f68-49b2-aa6a-68776c24cb27" (UID: "88960820-2f68-49b2-aa6a-68776c24cb27"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:38:31 crc kubenswrapper[4753]: I1005 20:38:31.787865 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88960820-2f68-49b2-aa6a-68776c24cb27-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:38:31 crc kubenswrapper[4753]: I1005 20:38:31.787900 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88960820-2f68-49b2-aa6a-68776c24cb27-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:38:31 crc kubenswrapper[4753]: I1005 20:38:31.787911 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p5dz\" (UniqueName: \"kubernetes.io/projected/88960820-2f68-49b2-aa6a-68776c24cb27-kube-api-access-4p5dz\") on node \"crc\" DevicePath \"\"" Oct 05 20:38:32 crc kubenswrapper[4753]: I1005 20:38:32.024669 4753 generic.go:334] "Generic (PLEG): container finished" podID="88960820-2f68-49b2-aa6a-68776c24cb27" containerID="13308c942119d75692694959d64dd3f876bc494ddbc7022cbe249150bb55f154" exitCode=0 Oct 05 20:38:32 crc kubenswrapper[4753]: I1005 20:38:32.024732 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wcxf" event={"ID":"88960820-2f68-49b2-aa6a-68776c24cb27","Type":"ContainerDied","Data":"13308c942119d75692694959d64dd3f876bc494ddbc7022cbe249150bb55f154"} Oct 05 20:38:32 crc kubenswrapper[4753]: I1005 20:38:32.024762 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5wcxf" event={"ID":"88960820-2f68-49b2-aa6a-68776c24cb27","Type":"ContainerDied","Data":"0854e962907b21628f01b91c4e4d60b9c44c8b787bb6d9a9432e1c382ca246ae"} Oct 05 20:38:32 crc kubenswrapper[4753]: I1005 20:38:32.024779 4753 scope.go:117] "RemoveContainer" containerID="13308c942119d75692694959d64dd3f876bc494ddbc7022cbe249150bb55f154" Oct 05 20:38:32 crc kubenswrapper[4753]: I1005 
20:38:32.024806 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5wcxf" Oct 05 20:38:32 crc kubenswrapper[4753]: I1005 20:38:32.047659 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5wcxf"] Oct 05 20:38:32 crc kubenswrapper[4753]: I1005 20:38:32.062581 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5wcxf"] Oct 05 20:38:32 crc kubenswrapper[4753]: I1005 20:38:32.065841 4753 scope.go:117] "RemoveContainer" containerID="2c8438bd0f0847beb24ef6c267a68d2ea4ff37cdc86b8cf209d75d85b481599d" Oct 05 20:38:32 crc kubenswrapper[4753]: I1005 20:38:32.087418 4753 scope.go:117] "RemoveContainer" containerID="aaacb92e279971845cb2ece06dec39974e42f334eade578c40bac890e7714cb2" Oct 05 20:38:32 crc kubenswrapper[4753]: I1005 20:38:32.127812 4753 scope.go:117] "RemoveContainer" containerID="13308c942119d75692694959d64dd3f876bc494ddbc7022cbe249150bb55f154" Oct 05 20:38:32 crc kubenswrapper[4753]: E1005 20:38:32.128784 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13308c942119d75692694959d64dd3f876bc494ddbc7022cbe249150bb55f154\": container with ID starting with 13308c942119d75692694959d64dd3f876bc494ddbc7022cbe249150bb55f154 not found: ID does not exist" containerID="13308c942119d75692694959d64dd3f876bc494ddbc7022cbe249150bb55f154" Oct 05 20:38:32 crc kubenswrapper[4753]: I1005 20:38:32.128824 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13308c942119d75692694959d64dd3f876bc494ddbc7022cbe249150bb55f154"} err="failed to get container status \"13308c942119d75692694959d64dd3f876bc494ddbc7022cbe249150bb55f154\": rpc error: code = NotFound desc = could not find container \"13308c942119d75692694959d64dd3f876bc494ddbc7022cbe249150bb55f154\": container with ID starting with 
13308c942119d75692694959d64dd3f876bc494ddbc7022cbe249150bb55f154 not found: ID does not exist" Oct 05 20:38:32 crc kubenswrapper[4753]: I1005 20:38:32.128848 4753 scope.go:117] "RemoveContainer" containerID="2c8438bd0f0847beb24ef6c267a68d2ea4ff37cdc86b8cf209d75d85b481599d" Oct 05 20:38:32 crc kubenswrapper[4753]: E1005 20:38:32.129338 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c8438bd0f0847beb24ef6c267a68d2ea4ff37cdc86b8cf209d75d85b481599d\": container with ID starting with 2c8438bd0f0847beb24ef6c267a68d2ea4ff37cdc86b8cf209d75d85b481599d not found: ID does not exist" containerID="2c8438bd0f0847beb24ef6c267a68d2ea4ff37cdc86b8cf209d75d85b481599d" Oct 05 20:38:32 crc kubenswrapper[4753]: I1005 20:38:32.129367 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c8438bd0f0847beb24ef6c267a68d2ea4ff37cdc86b8cf209d75d85b481599d"} err="failed to get container status \"2c8438bd0f0847beb24ef6c267a68d2ea4ff37cdc86b8cf209d75d85b481599d\": rpc error: code = NotFound desc = could not find container \"2c8438bd0f0847beb24ef6c267a68d2ea4ff37cdc86b8cf209d75d85b481599d\": container with ID starting with 2c8438bd0f0847beb24ef6c267a68d2ea4ff37cdc86b8cf209d75d85b481599d not found: ID does not exist" Oct 05 20:38:32 crc kubenswrapper[4753]: I1005 20:38:32.129389 4753 scope.go:117] "RemoveContainer" containerID="aaacb92e279971845cb2ece06dec39974e42f334eade578c40bac890e7714cb2" Oct 05 20:38:32 crc kubenswrapper[4753]: E1005 20:38:32.129708 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaacb92e279971845cb2ece06dec39974e42f334eade578c40bac890e7714cb2\": container with ID starting with aaacb92e279971845cb2ece06dec39974e42f334eade578c40bac890e7714cb2 not found: ID does not exist" containerID="aaacb92e279971845cb2ece06dec39974e42f334eade578c40bac890e7714cb2" Oct 05 20:38:32 crc 
kubenswrapper[4753]: I1005 20:38:32.129727 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaacb92e279971845cb2ece06dec39974e42f334eade578c40bac890e7714cb2"} err="failed to get container status \"aaacb92e279971845cb2ece06dec39974e42f334eade578c40bac890e7714cb2\": rpc error: code = NotFound desc = could not find container \"aaacb92e279971845cb2ece06dec39974e42f334eade578c40bac890e7714cb2\": container with ID starting with aaacb92e279971845cb2ece06dec39974e42f334eade578c40bac890e7714cb2 not found: ID does not exist" Oct 05 20:38:33 crc kubenswrapper[4753]: I1005 20:38:33.865892 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88960820-2f68-49b2-aa6a-68776c24cb27" path="/var/lib/kubelet/pods/88960820-2f68-49b2-aa6a-68776c24cb27/volumes" Oct 05 20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.340281 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hnt7x"] Oct 05 20:38:40 crc kubenswrapper[4753]: E1005 20:38:40.341325 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88960820-2f68-49b2-aa6a-68776c24cb27" containerName="registry-server" Oct 05 20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.341340 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="88960820-2f68-49b2-aa6a-68776c24cb27" containerName="registry-server" Oct 05 20:38:40 crc kubenswrapper[4753]: E1005 20:38:40.341350 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88960820-2f68-49b2-aa6a-68776c24cb27" containerName="extract-utilities" Oct 05 20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.341356 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="88960820-2f68-49b2-aa6a-68776c24cb27" containerName="extract-utilities" Oct 05 20:38:40 crc kubenswrapper[4753]: E1005 20:38:40.341387 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88960820-2f68-49b2-aa6a-68776c24cb27" containerName="extract-content" Oct 05 
20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.341394 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="88960820-2f68-49b2-aa6a-68776c24cb27" containerName="extract-content" Oct 05 20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.341602 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="88960820-2f68-49b2-aa6a-68776c24cb27" containerName="registry-server" Oct 05 20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.345244 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.377613 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hnt7x"] Oct 05 20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.450570 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66cdb\" (UniqueName: \"kubernetes.io/projected/e49df552-03b5-4969-a848-e3f36d630cac-kube-api-access-66cdb\") pod \"redhat-operators-hnt7x\" (UID: \"e49df552-03b5-4969-a848-e3f36d630cac\") " pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.450636 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e49df552-03b5-4969-a848-e3f36d630cac-utilities\") pod \"redhat-operators-hnt7x\" (UID: \"e49df552-03b5-4969-a848-e3f36d630cac\") " pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.451057 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e49df552-03b5-4969-a848-e3f36d630cac-catalog-content\") pod \"redhat-operators-hnt7x\" (UID: \"e49df552-03b5-4969-a848-e3f36d630cac\") " pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:38:40 crc 
kubenswrapper[4753]: I1005 20:38:40.552921 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e49df552-03b5-4969-a848-e3f36d630cac-utilities\") pod \"redhat-operators-hnt7x\" (UID: \"e49df552-03b5-4969-a848-e3f36d630cac\") " pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.553040 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e49df552-03b5-4969-a848-e3f36d630cac-catalog-content\") pod \"redhat-operators-hnt7x\" (UID: \"e49df552-03b5-4969-a848-e3f36d630cac\") " pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.553119 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66cdb\" (UniqueName: \"kubernetes.io/projected/e49df552-03b5-4969-a848-e3f36d630cac-kube-api-access-66cdb\") pod \"redhat-operators-hnt7x\" (UID: \"e49df552-03b5-4969-a848-e3f36d630cac\") " pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.553481 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e49df552-03b5-4969-a848-e3f36d630cac-utilities\") pod \"redhat-operators-hnt7x\" (UID: \"e49df552-03b5-4969-a848-e3f36d630cac\") " pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.553588 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e49df552-03b5-4969-a848-e3f36d630cac-catalog-content\") pod \"redhat-operators-hnt7x\" (UID: \"e49df552-03b5-4969-a848-e3f36d630cac\") " pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.572001 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66cdb\" (UniqueName: \"kubernetes.io/projected/e49df552-03b5-4969-a848-e3f36d630cac-kube-api-access-66cdb\") pod \"redhat-operators-hnt7x\" (UID: \"e49df552-03b5-4969-a848-e3f36d630cac\") " pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:38:40 crc kubenswrapper[4753]: I1005 20:38:40.679543 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:38:41 crc kubenswrapper[4753]: I1005 20:38:41.202843 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hnt7x"] Oct 05 20:38:42 crc kubenswrapper[4753]: I1005 20:38:42.132132 4753 generic.go:334] "Generic (PLEG): container finished" podID="e49df552-03b5-4969-a848-e3f36d630cac" containerID="f44f66021d24a077b79d54db89527464c970da7f01d4aaf2ae90c2b11e95d38a" exitCode=0 Oct 05 20:38:42 crc kubenswrapper[4753]: I1005 20:38:42.132219 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnt7x" event={"ID":"e49df552-03b5-4969-a848-e3f36d630cac","Type":"ContainerDied","Data":"f44f66021d24a077b79d54db89527464c970da7f01d4aaf2ae90c2b11e95d38a"} Oct 05 20:38:42 crc kubenswrapper[4753]: I1005 20:38:42.132569 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnt7x" event={"ID":"e49df552-03b5-4969-a848-e3f36d630cac","Type":"ContainerStarted","Data":"b2a9b22997371aaf7f07187d9a1fa7db3b60f232188f07d6f7a6e075014a958c"} Oct 05 20:38:43 crc kubenswrapper[4753]: I1005 20:38:43.143020 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnt7x" event={"ID":"e49df552-03b5-4969-a848-e3f36d630cac","Type":"ContainerStarted","Data":"c736eb62e7f9afe4b6c45966e49f408b20906771937c77ab23fd586263f5ddab"} Oct 05 20:38:47 crc kubenswrapper[4753]: I1005 20:38:47.179484 4753 generic.go:334] "Generic 
(PLEG): container finished" podID="e49df552-03b5-4969-a848-e3f36d630cac" containerID="c736eb62e7f9afe4b6c45966e49f408b20906771937c77ab23fd586263f5ddab" exitCode=0 Oct 05 20:38:47 crc kubenswrapper[4753]: I1005 20:38:47.179595 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnt7x" event={"ID":"e49df552-03b5-4969-a848-e3f36d630cac","Type":"ContainerDied","Data":"c736eb62e7f9afe4b6c45966e49f408b20906771937c77ab23fd586263f5ddab"} Oct 05 20:38:48 crc kubenswrapper[4753]: I1005 20:38:48.189361 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnt7x" event={"ID":"e49df552-03b5-4969-a848-e3f36d630cac","Type":"ContainerStarted","Data":"2ab58fa0546d24713e67e1fa9c40de91dd93c161c36439b103cd60fc2dfb1fc7"} Oct 05 20:38:48 crc kubenswrapper[4753]: I1005 20:38:48.222465 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hnt7x" podStartSLOduration=2.710795706 podStartE2EDuration="8.222447649s" podCreationTimestamp="2025-10-05 20:38:40 +0000 UTC" firstStartedPulling="2025-10-05 20:38:42.134315037 +0000 UTC m=+1430.982643269" lastFinishedPulling="2025-10-05 20:38:47.64596698 +0000 UTC m=+1436.494295212" observedRunningTime="2025-10-05 20:38:48.215224415 +0000 UTC m=+1437.063552647" watchObservedRunningTime="2025-10-05 20:38:48.222447649 +0000 UTC m=+1437.070775881" Oct 05 20:38:50 crc kubenswrapper[4753]: I1005 20:38:50.679868 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:38:50 crc kubenswrapper[4753]: I1005 20:38:50.680955 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:38:51 crc kubenswrapper[4753]: I1005 20:38:51.733103 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hnt7x" 
podUID="e49df552-03b5-4969-a848-e3f36d630cac" containerName="registry-server" probeResult="failure" output=< Oct 05 20:38:51 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 20:38:51 crc kubenswrapper[4753]: > Oct 05 20:39:00 crc kubenswrapper[4753]: I1005 20:39:00.734204 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:39:00 crc kubenswrapper[4753]: I1005 20:39:00.792812 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:39:00 crc kubenswrapper[4753]: I1005 20:39:00.970457 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hnt7x"] Oct 05 20:39:02 crc kubenswrapper[4753]: I1005 20:39:02.313613 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hnt7x" podUID="e49df552-03b5-4969-a848-e3f36d630cac" containerName="registry-server" containerID="cri-o://2ab58fa0546d24713e67e1fa9c40de91dd93c161c36439b103cd60fc2dfb1fc7" gracePeriod=2 Oct 05 20:39:02 crc kubenswrapper[4753]: I1005 20:39:02.782413 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:39:02 crc kubenswrapper[4753]: I1005 20:39:02.979033 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e49df552-03b5-4969-a848-e3f36d630cac-utilities\") pod \"e49df552-03b5-4969-a848-e3f36d630cac\" (UID: \"e49df552-03b5-4969-a848-e3f36d630cac\") " Oct 05 20:39:02 crc kubenswrapper[4753]: I1005 20:39:02.979167 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66cdb\" (UniqueName: \"kubernetes.io/projected/e49df552-03b5-4969-a848-e3f36d630cac-kube-api-access-66cdb\") pod \"e49df552-03b5-4969-a848-e3f36d630cac\" (UID: \"e49df552-03b5-4969-a848-e3f36d630cac\") " Oct 05 20:39:02 crc kubenswrapper[4753]: I1005 20:39:02.979226 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e49df552-03b5-4969-a848-e3f36d630cac-catalog-content\") pod \"e49df552-03b5-4969-a848-e3f36d630cac\" (UID: \"e49df552-03b5-4969-a848-e3f36d630cac\") " Oct 05 20:39:02 crc kubenswrapper[4753]: I1005 20:39:02.980334 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49df552-03b5-4969-a848-e3f36d630cac-utilities" (OuterVolumeSpecName: "utilities") pod "e49df552-03b5-4969-a848-e3f36d630cac" (UID: "e49df552-03b5-4969-a848-e3f36d630cac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:39:02 crc kubenswrapper[4753]: I1005 20:39:02.989524 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49df552-03b5-4969-a848-e3f36d630cac-kube-api-access-66cdb" (OuterVolumeSpecName: "kube-api-access-66cdb") pod "e49df552-03b5-4969-a848-e3f36d630cac" (UID: "e49df552-03b5-4969-a848-e3f36d630cac"). InnerVolumeSpecName "kube-api-access-66cdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.060127 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49df552-03b5-4969-a848-e3f36d630cac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e49df552-03b5-4969-a848-e3f36d630cac" (UID: "e49df552-03b5-4969-a848-e3f36d630cac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.080786 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e49df552-03b5-4969-a848-e3f36d630cac-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.080822 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66cdb\" (UniqueName: \"kubernetes.io/projected/e49df552-03b5-4969-a848-e3f36d630cac-kube-api-access-66cdb\") on node \"crc\" DevicePath \"\"" Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.080836 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e49df552-03b5-4969-a848-e3f36d630cac-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.324106 4753 generic.go:334] "Generic (PLEG): container finished" podID="e49df552-03b5-4969-a848-e3f36d630cac" containerID="2ab58fa0546d24713e67e1fa9c40de91dd93c161c36439b103cd60fc2dfb1fc7" exitCode=0 Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.324206 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hnt7x" Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.324241 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnt7x" event={"ID":"e49df552-03b5-4969-a848-e3f36d630cac","Type":"ContainerDied","Data":"2ab58fa0546d24713e67e1fa9c40de91dd93c161c36439b103cd60fc2dfb1fc7"} Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.325111 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnt7x" event={"ID":"e49df552-03b5-4969-a848-e3f36d630cac","Type":"ContainerDied","Data":"b2a9b22997371aaf7f07187d9a1fa7db3b60f232188f07d6f7a6e075014a958c"} Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.325134 4753 scope.go:117] "RemoveContainer" containerID="2ab58fa0546d24713e67e1fa9c40de91dd93c161c36439b103cd60fc2dfb1fc7" Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.355004 4753 scope.go:117] "RemoveContainer" containerID="c736eb62e7f9afe4b6c45966e49f408b20906771937c77ab23fd586263f5ddab" Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.355740 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hnt7x"] Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.370425 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hnt7x"] Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.381815 4753 scope.go:117] "RemoveContainer" containerID="f44f66021d24a077b79d54db89527464c970da7f01d4aaf2ae90c2b11e95d38a" Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.431278 4753 scope.go:117] "RemoveContainer" containerID="2ab58fa0546d24713e67e1fa9c40de91dd93c161c36439b103cd60fc2dfb1fc7" Oct 05 20:39:03 crc kubenswrapper[4753]: E1005 20:39:03.432245 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2ab58fa0546d24713e67e1fa9c40de91dd93c161c36439b103cd60fc2dfb1fc7\": container with ID starting with 2ab58fa0546d24713e67e1fa9c40de91dd93c161c36439b103cd60fc2dfb1fc7 not found: ID does not exist" containerID="2ab58fa0546d24713e67e1fa9c40de91dd93c161c36439b103cd60fc2dfb1fc7" Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.432279 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ab58fa0546d24713e67e1fa9c40de91dd93c161c36439b103cd60fc2dfb1fc7"} err="failed to get container status \"2ab58fa0546d24713e67e1fa9c40de91dd93c161c36439b103cd60fc2dfb1fc7\": rpc error: code = NotFound desc = could not find container \"2ab58fa0546d24713e67e1fa9c40de91dd93c161c36439b103cd60fc2dfb1fc7\": container with ID starting with 2ab58fa0546d24713e67e1fa9c40de91dd93c161c36439b103cd60fc2dfb1fc7 not found: ID does not exist" Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.432303 4753 scope.go:117] "RemoveContainer" containerID="c736eb62e7f9afe4b6c45966e49f408b20906771937c77ab23fd586263f5ddab" Oct 05 20:39:03 crc kubenswrapper[4753]: E1005 20:39:03.432630 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c736eb62e7f9afe4b6c45966e49f408b20906771937c77ab23fd586263f5ddab\": container with ID starting with c736eb62e7f9afe4b6c45966e49f408b20906771937c77ab23fd586263f5ddab not found: ID does not exist" containerID="c736eb62e7f9afe4b6c45966e49f408b20906771937c77ab23fd586263f5ddab" Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.432659 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c736eb62e7f9afe4b6c45966e49f408b20906771937c77ab23fd586263f5ddab"} err="failed to get container status \"c736eb62e7f9afe4b6c45966e49f408b20906771937c77ab23fd586263f5ddab\": rpc error: code = NotFound desc = could not find container \"c736eb62e7f9afe4b6c45966e49f408b20906771937c77ab23fd586263f5ddab\": container with ID 
starting with c736eb62e7f9afe4b6c45966e49f408b20906771937c77ab23fd586263f5ddab not found: ID does not exist" Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.432675 4753 scope.go:117] "RemoveContainer" containerID="f44f66021d24a077b79d54db89527464c970da7f01d4aaf2ae90c2b11e95d38a" Oct 05 20:39:03 crc kubenswrapper[4753]: E1005 20:39:03.432910 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f44f66021d24a077b79d54db89527464c970da7f01d4aaf2ae90c2b11e95d38a\": container with ID starting with f44f66021d24a077b79d54db89527464c970da7f01d4aaf2ae90c2b11e95d38a not found: ID does not exist" containerID="f44f66021d24a077b79d54db89527464c970da7f01d4aaf2ae90c2b11e95d38a" Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.432931 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f44f66021d24a077b79d54db89527464c970da7f01d4aaf2ae90c2b11e95d38a"} err="failed to get container status \"f44f66021d24a077b79d54db89527464c970da7f01d4aaf2ae90c2b11e95d38a\": rpc error: code = NotFound desc = could not find container \"f44f66021d24a077b79d54db89527464c970da7f01d4aaf2ae90c2b11e95d38a\": container with ID starting with f44f66021d24a077b79d54db89527464c970da7f01d4aaf2ae90c2b11e95d38a not found: ID does not exist" Oct 05 20:39:03 crc kubenswrapper[4753]: I1005 20:39:03.884771 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e49df552-03b5-4969-a848-e3f36d630cac" path="/var/lib/kubelet/pods/e49df552-03b5-4969-a848-e3f36d630cac/volumes" Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.516855 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gqdbb"] Oct 05 20:39:15 crc kubenswrapper[4753]: E1005 20:39:15.517843 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49df552-03b5-4969-a848-e3f36d630cac" containerName="registry-server" Oct 05 20:39:15 crc 
kubenswrapper[4753]: I1005 20:39:15.517859 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49df552-03b5-4969-a848-e3f36d630cac" containerName="registry-server" Oct 05 20:39:15 crc kubenswrapper[4753]: E1005 20:39:15.517875 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49df552-03b5-4969-a848-e3f36d630cac" containerName="extract-content" Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.517883 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49df552-03b5-4969-a848-e3f36d630cac" containerName="extract-content" Oct 05 20:39:15 crc kubenswrapper[4753]: E1005 20:39:15.517895 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49df552-03b5-4969-a848-e3f36d630cac" containerName="extract-utilities" Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.517902 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49df552-03b5-4969-a848-e3f36d630cac" containerName="extract-utilities" Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.518168 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="e49df552-03b5-4969-a848-e3f36d630cac" containerName="registry-server" Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.519674 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.529660 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gqdbb"] Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.705438 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08c53741-7d89-4945-b9c8-4710e743a473-utilities\") pod \"community-operators-gqdbb\" (UID: \"08c53741-7d89-4945-b9c8-4710e743a473\") " pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.705504 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j48cn\" (UniqueName: \"kubernetes.io/projected/08c53741-7d89-4945-b9c8-4710e743a473-kube-api-access-j48cn\") pod \"community-operators-gqdbb\" (UID: \"08c53741-7d89-4945-b9c8-4710e743a473\") " pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.705583 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08c53741-7d89-4945-b9c8-4710e743a473-catalog-content\") pod \"community-operators-gqdbb\" (UID: \"08c53741-7d89-4945-b9c8-4710e743a473\") " pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.806928 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08c53741-7d89-4945-b9c8-4710e743a473-utilities\") pod \"community-operators-gqdbb\" (UID: \"08c53741-7d89-4945-b9c8-4710e743a473\") " pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.806987 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j48cn\" (UniqueName: \"kubernetes.io/projected/08c53741-7d89-4945-b9c8-4710e743a473-kube-api-access-j48cn\") pod \"community-operators-gqdbb\" (UID: \"08c53741-7d89-4945-b9c8-4710e743a473\") " pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.807053 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08c53741-7d89-4945-b9c8-4710e743a473-catalog-content\") pod \"community-operators-gqdbb\" (UID: \"08c53741-7d89-4945-b9c8-4710e743a473\") " pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.807542 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08c53741-7d89-4945-b9c8-4710e743a473-catalog-content\") pod \"community-operators-gqdbb\" (UID: \"08c53741-7d89-4945-b9c8-4710e743a473\") " pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.807822 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08c53741-7d89-4945-b9c8-4710e743a473-utilities\") pod \"community-operators-gqdbb\" (UID: \"08c53741-7d89-4945-b9c8-4710e743a473\") " pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.827069 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j48cn\" (UniqueName: \"kubernetes.io/projected/08c53741-7d89-4945-b9c8-4710e743a473-kube-api-access-j48cn\") pod \"community-operators-gqdbb\" (UID: \"08c53741-7d89-4945-b9c8-4710e743a473\") " pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:15 crc kubenswrapper[4753]: I1005 20:39:15.845381 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:16 crc kubenswrapper[4753]: I1005 20:39:16.338813 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gqdbb"] Oct 05 20:39:16 crc kubenswrapper[4753]: I1005 20:39:16.441732 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqdbb" event={"ID":"08c53741-7d89-4945-b9c8-4710e743a473","Type":"ContainerStarted","Data":"496236ded9202939569071e4d270af012b8a2238f8dd99f1d0669f769eaa8b23"} Oct 05 20:39:17 crc kubenswrapper[4753]: I1005 20:39:17.474797 4753 generic.go:334] "Generic (PLEG): container finished" podID="08c53741-7d89-4945-b9c8-4710e743a473" containerID="bf12ca307235ab07bd1538ee77497ded7022925067885890193e69ccd50230f6" exitCode=0 Oct 05 20:39:17 crc kubenswrapper[4753]: I1005 20:39:17.475092 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqdbb" event={"ID":"08c53741-7d89-4945-b9c8-4710e743a473","Type":"ContainerDied","Data":"bf12ca307235ab07bd1538ee77497ded7022925067885890193e69ccd50230f6"} Oct 05 20:39:19 crc kubenswrapper[4753]: I1005 20:39:19.492895 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqdbb" event={"ID":"08c53741-7d89-4945-b9c8-4710e743a473","Type":"ContainerStarted","Data":"e221039b42542a8f7496df8af6de7434248bb50a53f362bb7b47951cc1d65159"} Oct 05 20:39:20 crc kubenswrapper[4753]: I1005 20:39:20.506022 4753 generic.go:334] "Generic (PLEG): container finished" podID="08c53741-7d89-4945-b9c8-4710e743a473" containerID="e221039b42542a8f7496df8af6de7434248bb50a53f362bb7b47951cc1d65159" exitCode=0 Oct 05 20:39:20 crc kubenswrapper[4753]: I1005 20:39:20.506116 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqdbb" 
event={"ID":"08c53741-7d89-4945-b9c8-4710e743a473","Type":"ContainerDied","Data":"e221039b42542a8f7496df8af6de7434248bb50a53f362bb7b47951cc1d65159"} Oct 05 20:39:21 crc kubenswrapper[4753]: I1005 20:39:21.557454 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqdbb" event={"ID":"08c53741-7d89-4945-b9c8-4710e743a473","Type":"ContainerStarted","Data":"82af7688a0ba9f175f6a35ee3ef1e83ce0a25ecbbc6f0f87f69b3cef188fa584"} Oct 05 20:39:21 crc kubenswrapper[4753]: I1005 20:39:21.560917 4753 generic.go:334] "Generic (PLEG): container finished" podID="03feab36-54ee-4968-a9d9-841cbd059c48" containerID="99084cf66eb920dccd49c79c0cced862601fb3880ba396b1d3cc4ecc013c55c7" exitCode=0 Oct 05 20:39:21 crc kubenswrapper[4753]: I1005 20:39:21.560958 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf" event={"ID":"03feab36-54ee-4968-a9d9-841cbd059c48","Type":"ContainerDied","Data":"99084cf66eb920dccd49c79c0cced862601fb3880ba396b1d3cc4ecc013c55c7"} Oct 05 20:39:21 crc kubenswrapper[4753]: I1005 20:39:21.622602 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gqdbb" podStartSLOduration=3.196392455 podStartE2EDuration="6.622581917s" podCreationTimestamp="2025-10-05 20:39:15 +0000 UTC" firstStartedPulling="2025-10-05 20:39:17.477317703 +0000 UTC m=+1466.325645945" lastFinishedPulling="2025-10-05 20:39:20.903507175 +0000 UTC m=+1469.751835407" observedRunningTime="2025-10-05 20:39:21.57547972 +0000 UTC m=+1470.423807962" watchObservedRunningTime="2025-10-05 20:39:21.622581917 +0000 UTC m=+1470.470910149" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.026484 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.137316 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-bootstrap-combined-ca-bundle\") pod \"03feab36-54ee-4968-a9d9-841cbd059c48\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.137396 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-ssh-key\") pod \"03feab36-54ee-4968-a9d9-841cbd059c48\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.137439 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8cfj\" (UniqueName: \"kubernetes.io/projected/03feab36-54ee-4968-a9d9-841cbd059c48-kube-api-access-r8cfj\") pod \"03feab36-54ee-4968-a9d9-841cbd059c48\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.137569 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-inventory\") pod \"03feab36-54ee-4968-a9d9-841cbd059c48\" (UID: \"03feab36-54ee-4968-a9d9-841cbd059c48\") " Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.147134 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03feab36-54ee-4968-a9d9-841cbd059c48-kube-api-access-r8cfj" (OuterVolumeSpecName: "kube-api-access-r8cfj") pod "03feab36-54ee-4968-a9d9-841cbd059c48" (UID: "03feab36-54ee-4968-a9d9-841cbd059c48"). InnerVolumeSpecName "kube-api-access-r8cfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.148246 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "03feab36-54ee-4968-a9d9-841cbd059c48" (UID: "03feab36-54ee-4968-a9d9-841cbd059c48"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.177329 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-inventory" (OuterVolumeSpecName: "inventory") pod "03feab36-54ee-4968-a9d9-841cbd059c48" (UID: "03feab36-54ee-4968-a9d9-841cbd059c48"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.180246 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "03feab36-54ee-4968-a9d9-841cbd059c48" (UID: "03feab36-54ee-4968-a9d9-841cbd059c48"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.239331 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.239365 4753 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.239377 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03feab36-54ee-4968-a9d9-841cbd059c48-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.239387 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8cfj\" (UniqueName: \"kubernetes.io/projected/03feab36-54ee-4968-a9d9-841cbd059c48-kube-api-access-r8cfj\") on node \"crc\" DevicePath \"\"" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.578574 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf" event={"ID":"03feab36-54ee-4968-a9d9-841cbd059c48","Type":"ContainerDied","Data":"2d9e090eb20ff56e52c73e28a142ff8e77d22d12088c2ff92ff74c649a600a15"} Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.578909 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d9e090eb20ff56e52c73e28a142ff8e77d22d12088c2ff92ff74c649a600a15" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.578616 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.679187 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2"] Oct 05 20:39:23 crc kubenswrapper[4753]: E1005 20:39:23.679635 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03feab36-54ee-4968-a9d9-841cbd059c48" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.679657 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="03feab36-54ee-4968-a9d9-841cbd059c48" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.679884 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="03feab36-54ee-4968-a9d9-841cbd059c48" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.680585 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.682178 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.682632 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.682742 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.685424 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.699382 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2"] Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.849682 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h565\" (UniqueName: \"kubernetes.io/projected/0d82b913-d184-4394-8ec8-bc43006d5c38-kube-api-access-5h565\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2\" (UID: \"0d82b913-d184-4394-8ec8-bc43006d5c38\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.850075 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d82b913-d184-4394-8ec8-bc43006d5c38-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2\" (UID: \"0d82b913-d184-4394-8ec8-bc43006d5c38\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" Oct 05 20:39:23 crc kubenswrapper[4753]: 
I1005 20:39:23.850342 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d82b913-d184-4394-8ec8-bc43006d5c38-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2\" (UID: \"0d82b913-d184-4394-8ec8-bc43006d5c38\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.951458 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h565\" (UniqueName: \"kubernetes.io/projected/0d82b913-d184-4394-8ec8-bc43006d5c38-kube-api-access-5h565\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2\" (UID: \"0d82b913-d184-4394-8ec8-bc43006d5c38\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.951508 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d82b913-d184-4394-8ec8-bc43006d5c38-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2\" (UID: \"0d82b913-d184-4394-8ec8-bc43006d5c38\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.951594 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d82b913-d184-4394-8ec8-bc43006d5c38-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2\" (UID: \"0d82b913-d184-4394-8ec8-bc43006d5c38\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.960401 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d82b913-d184-4394-8ec8-bc43006d5c38-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2\" (UID: \"0d82b913-d184-4394-8ec8-bc43006d5c38\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.961792 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d82b913-d184-4394-8ec8-bc43006d5c38-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2\" (UID: \"0d82b913-d184-4394-8ec8-bc43006d5c38\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.968323 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h565\" (UniqueName: \"kubernetes.io/projected/0d82b913-d184-4394-8ec8-bc43006d5c38-kube-api-access-5h565\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2\" (UID: \"0d82b913-d184-4394-8ec8-bc43006d5c38\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" Oct 05 20:39:23 crc kubenswrapper[4753]: I1005 20:39:23.998398 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" Oct 05 20:39:24 crc kubenswrapper[4753]: W1005 20:39:24.503164 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d82b913_d184_4394_8ec8_bc43006d5c38.slice/crio-e13e69f89dbc5dd055aa124319b0250076dbfc2431e0c8ce465ed3ace778238e WatchSource:0}: Error finding container e13e69f89dbc5dd055aa124319b0250076dbfc2431e0c8ce465ed3ace778238e: Status 404 returned error can't find the container with id e13e69f89dbc5dd055aa124319b0250076dbfc2431e0c8ce465ed3ace778238e Oct 05 20:39:24 crc kubenswrapper[4753]: I1005 20:39:24.521626 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2"] Oct 05 20:39:24 crc kubenswrapper[4753]: I1005 20:39:24.588450 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" event={"ID":"0d82b913-d184-4394-8ec8-bc43006d5c38","Type":"ContainerStarted","Data":"e13e69f89dbc5dd055aa124319b0250076dbfc2431e0c8ce465ed3ace778238e"} Oct 05 20:39:25 crc kubenswrapper[4753]: I1005 20:39:25.596452 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" event={"ID":"0d82b913-d184-4394-8ec8-bc43006d5c38","Type":"ContainerStarted","Data":"ac40edaaf4c147bed35dc4d220690835d3a96a7ff30069626b8fe66a3f6b5c3f"} Oct 05 20:39:25 crc kubenswrapper[4753]: I1005 20:39:25.614610 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" podStartSLOduration=2.097645721 podStartE2EDuration="2.614585926s" podCreationTimestamp="2025-10-05 20:39:23 +0000 UTC" firstStartedPulling="2025-10-05 20:39:24.505299747 +0000 UTC m=+1473.353627979" lastFinishedPulling="2025-10-05 20:39:25.022239952 +0000 
UTC m=+1473.870568184" observedRunningTime="2025-10-05 20:39:25.608485726 +0000 UTC m=+1474.456813958" watchObservedRunningTime="2025-10-05 20:39:25.614585926 +0000 UTC m=+1474.462914168" Oct 05 20:39:25 crc kubenswrapper[4753]: I1005 20:39:25.845730 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:25 crc kubenswrapper[4753]: I1005 20:39:25.845775 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:25 crc kubenswrapper[4753]: I1005 20:39:25.894839 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:26 crc kubenswrapper[4753]: I1005 20:39:26.659467 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:26 crc kubenswrapper[4753]: I1005 20:39:26.710122 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gqdbb"] Oct 05 20:39:28 crc kubenswrapper[4753]: I1005 20:39:28.617735 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gqdbb" podUID="08c53741-7d89-4945-b9c8-4710e743a473" containerName="registry-server" containerID="cri-o://82af7688a0ba9f175f6a35ee3ef1e83ce0a25ecbbc6f0f87f69b3cef188fa584" gracePeriod=2 Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.051244 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.249413 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08c53741-7d89-4945-b9c8-4710e743a473-utilities\") pod \"08c53741-7d89-4945-b9c8-4710e743a473\" (UID: \"08c53741-7d89-4945-b9c8-4710e743a473\") " Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.249512 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j48cn\" (UniqueName: \"kubernetes.io/projected/08c53741-7d89-4945-b9c8-4710e743a473-kube-api-access-j48cn\") pod \"08c53741-7d89-4945-b9c8-4710e743a473\" (UID: \"08c53741-7d89-4945-b9c8-4710e743a473\") " Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.249639 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08c53741-7d89-4945-b9c8-4710e743a473-catalog-content\") pod \"08c53741-7d89-4945-b9c8-4710e743a473\" (UID: \"08c53741-7d89-4945-b9c8-4710e743a473\") " Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.250827 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c53741-7d89-4945-b9c8-4710e743a473-utilities" (OuterVolumeSpecName: "utilities") pod "08c53741-7d89-4945-b9c8-4710e743a473" (UID: "08c53741-7d89-4945-b9c8-4710e743a473"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.254470 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c53741-7d89-4945-b9c8-4710e743a473-kube-api-access-j48cn" (OuterVolumeSpecName: "kube-api-access-j48cn") pod "08c53741-7d89-4945-b9c8-4710e743a473" (UID: "08c53741-7d89-4945-b9c8-4710e743a473"). InnerVolumeSpecName "kube-api-access-j48cn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.303974 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c53741-7d89-4945-b9c8-4710e743a473-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08c53741-7d89-4945-b9c8-4710e743a473" (UID: "08c53741-7d89-4945-b9c8-4710e743a473"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.352323 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08c53741-7d89-4945-b9c8-4710e743a473-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.352362 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j48cn\" (UniqueName: \"kubernetes.io/projected/08c53741-7d89-4945-b9c8-4710e743a473-kube-api-access-j48cn\") on node \"crc\" DevicePath \"\"" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.352373 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08c53741-7d89-4945-b9c8-4710e743a473-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.647030 4753 generic.go:334] "Generic (PLEG): container finished" podID="08c53741-7d89-4945-b9c8-4710e743a473" containerID="82af7688a0ba9f175f6a35ee3ef1e83ce0a25ecbbc6f0f87f69b3cef188fa584" exitCode=0 Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.647193 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gqdbb" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.647193 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqdbb" event={"ID":"08c53741-7d89-4945-b9c8-4710e743a473","Type":"ContainerDied","Data":"82af7688a0ba9f175f6a35ee3ef1e83ce0a25ecbbc6f0f87f69b3cef188fa584"} Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.647474 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqdbb" event={"ID":"08c53741-7d89-4945-b9c8-4710e743a473","Type":"ContainerDied","Data":"496236ded9202939569071e4d270af012b8a2238f8dd99f1d0669f769eaa8b23"} Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.647491 4753 scope.go:117] "RemoveContainer" containerID="82af7688a0ba9f175f6a35ee3ef1e83ce0a25ecbbc6f0f87f69b3cef188fa584" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.680414 4753 scope.go:117] "RemoveContainer" containerID="e221039b42542a8f7496df8af6de7434248bb50a53f362bb7b47951cc1d65159" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.680922 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gqdbb"] Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.690389 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gqdbb"] Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.716916 4753 scope.go:117] "RemoveContainer" containerID="bf12ca307235ab07bd1538ee77497ded7022925067885890193e69ccd50230f6" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.754269 4753 scope.go:117] "RemoveContainer" containerID="82af7688a0ba9f175f6a35ee3ef1e83ce0a25ecbbc6f0f87f69b3cef188fa584" Oct 05 20:39:29 crc kubenswrapper[4753]: E1005 20:39:29.754923 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"82af7688a0ba9f175f6a35ee3ef1e83ce0a25ecbbc6f0f87f69b3cef188fa584\": container with ID starting with 82af7688a0ba9f175f6a35ee3ef1e83ce0a25ecbbc6f0f87f69b3cef188fa584 not found: ID does not exist" containerID="82af7688a0ba9f175f6a35ee3ef1e83ce0a25ecbbc6f0f87f69b3cef188fa584" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.754974 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82af7688a0ba9f175f6a35ee3ef1e83ce0a25ecbbc6f0f87f69b3cef188fa584"} err="failed to get container status \"82af7688a0ba9f175f6a35ee3ef1e83ce0a25ecbbc6f0f87f69b3cef188fa584\": rpc error: code = NotFound desc = could not find container \"82af7688a0ba9f175f6a35ee3ef1e83ce0a25ecbbc6f0f87f69b3cef188fa584\": container with ID starting with 82af7688a0ba9f175f6a35ee3ef1e83ce0a25ecbbc6f0f87f69b3cef188fa584 not found: ID does not exist" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.755007 4753 scope.go:117] "RemoveContainer" containerID="e221039b42542a8f7496df8af6de7434248bb50a53f362bb7b47951cc1d65159" Oct 05 20:39:29 crc kubenswrapper[4753]: E1005 20:39:29.755994 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e221039b42542a8f7496df8af6de7434248bb50a53f362bb7b47951cc1d65159\": container with ID starting with e221039b42542a8f7496df8af6de7434248bb50a53f362bb7b47951cc1d65159 not found: ID does not exist" containerID="e221039b42542a8f7496df8af6de7434248bb50a53f362bb7b47951cc1d65159" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.756048 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e221039b42542a8f7496df8af6de7434248bb50a53f362bb7b47951cc1d65159"} err="failed to get container status \"e221039b42542a8f7496df8af6de7434248bb50a53f362bb7b47951cc1d65159\": rpc error: code = NotFound desc = could not find container \"e221039b42542a8f7496df8af6de7434248bb50a53f362bb7b47951cc1d65159\": container with ID 
starting with e221039b42542a8f7496df8af6de7434248bb50a53f362bb7b47951cc1d65159 not found: ID does not exist" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.756091 4753 scope.go:117] "RemoveContainer" containerID="bf12ca307235ab07bd1538ee77497ded7022925067885890193e69ccd50230f6" Oct 05 20:39:29 crc kubenswrapper[4753]: E1005 20:39:29.756566 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf12ca307235ab07bd1538ee77497ded7022925067885890193e69ccd50230f6\": container with ID starting with bf12ca307235ab07bd1538ee77497ded7022925067885890193e69ccd50230f6 not found: ID does not exist" containerID="bf12ca307235ab07bd1538ee77497ded7022925067885890193e69ccd50230f6" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.756631 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf12ca307235ab07bd1538ee77497ded7022925067885890193e69ccd50230f6"} err="failed to get container status \"bf12ca307235ab07bd1538ee77497ded7022925067885890193e69ccd50230f6\": rpc error: code = NotFound desc = could not find container \"bf12ca307235ab07bd1538ee77497ded7022925067885890193e69ccd50230f6\": container with ID starting with bf12ca307235ab07bd1538ee77497ded7022925067885890193e69ccd50230f6 not found: ID does not exist" Oct 05 20:39:29 crc kubenswrapper[4753]: I1005 20:39:29.864051 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c53741-7d89-4945-b9c8-4710e743a473" path="/var/lib/kubelet/pods/08c53741-7d89-4945-b9c8-4710e743a473/volumes" Oct 05 20:40:01 crc kubenswrapper[4753]: I1005 20:40:01.747778 4753 scope.go:117] "RemoveContainer" containerID="52563c869e0b7d0ff34706d82591063a3c61749c4a619ef0e1a87282cf4b0c13" Oct 05 20:40:01 crc kubenswrapper[4753]: I1005 20:40:01.776768 4753 scope.go:117] "RemoveContainer" containerID="d5f5c2d87a339558b8c7d072d0368a65cae97b026863695b0b745901a330aae2" Oct 05 20:40:01 crc kubenswrapper[4753]: 
I1005 20:40:01.803882 4753 scope.go:117] "RemoveContainer" containerID="82fb32666c6c51c7575059470f0d7b9771372741626a985ea5becf40d5ce747b" Oct 05 20:40:01 crc kubenswrapper[4753]: I1005 20:40:01.833656 4753 scope.go:117] "RemoveContainer" containerID="795c0827726e33acf2c2701ce3fe485a5f0ecfe0baeeb402e25cd3b0905405d7" Oct 05 20:40:04 crc kubenswrapper[4753]: I1005 20:40:04.490208 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:40:04 crc kubenswrapper[4753]: I1005 20:40:04.491310 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:40:24 crc kubenswrapper[4753]: I1005 20:40:24.060467 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ps278"] Oct 05 20:40:24 crc kubenswrapper[4753]: I1005 20:40:24.070352 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ps278"] Oct 05 20:40:25 crc kubenswrapper[4753]: I1005 20:40:25.869354 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5bca284-a880-4ea8-a0fa-74a08ac1f840" path="/var/lib/kubelet/pods/a5bca284-a880-4ea8-a0fa-74a08ac1f840/volumes" Oct 05 20:40:28 crc kubenswrapper[4753]: I1005 20:40:28.025057 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-7h44h"] Oct 05 20:40:28 crc kubenswrapper[4753]: I1005 20:40:28.034758 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-sd4c8"] Oct 05 20:40:28 crc kubenswrapper[4753]: I1005 
20:40:28.045920 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-sd4c8"] Oct 05 20:40:28 crc kubenswrapper[4753]: I1005 20:40:28.054631 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-7h44h"] Oct 05 20:40:29 crc kubenswrapper[4753]: I1005 20:40:29.861288 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a13c89-fe4e-42be-88c4-fe420d90f169" path="/var/lib/kubelet/pods/51a13c89-fe4e-42be-88c4-fe420d90f169/volumes" Oct 05 20:40:29 crc kubenswrapper[4753]: I1005 20:40:29.862079 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89d998ff-850f-4cc5-a430-4671c9a7b68a" path="/var/lib/kubelet/pods/89d998ff-850f-4cc5-a430-4671c9a7b68a/volumes" Oct 05 20:40:34 crc kubenswrapper[4753]: I1005 20:40:34.039554 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b795-account-create-6hggf"] Oct 05 20:40:34 crc kubenswrapper[4753]: I1005 20:40:34.047455 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b795-account-create-6hggf"] Oct 05 20:40:34 crc kubenswrapper[4753]: I1005 20:40:34.490757 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:40:34 crc kubenswrapper[4753]: I1005 20:40:34.490878 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:40:35 crc kubenswrapper[4753]: I1005 20:40:35.863873 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7f079c8d-3df8-41e9-a6f6-fd47d276928c" path="/var/lib/kubelet/pods/7f079c8d-3df8-41e9-a6f6-fd47d276928c/volumes" Oct 05 20:40:38 crc kubenswrapper[4753]: I1005 20:40:38.041852 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b3c3-account-create-9gh57"] Oct 05 20:40:38 crc kubenswrapper[4753]: I1005 20:40:38.056022 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2847-account-create-l4pv6"] Oct 05 20:40:38 crc kubenswrapper[4753]: I1005 20:40:38.064796 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b3c3-account-create-9gh57"] Oct 05 20:40:38 crc kubenswrapper[4753]: I1005 20:40:38.071865 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2847-account-create-l4pv6"] Oct 05 20:40:39 crc kubenswrapper[4753]: I1005 20:40:39.864588 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97041ad3-4fe1-475b-8b49-54c1ccab26d8" path="/var/lib/kubelet/pods/97041ad3-4fe1-475b-8b49-54c1ccab26d8/volumes" Oct 05 20:40:39 crc kubenswrapper[4753]: I1005 20:40:39.866340 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd9a587b-445e-4063-bc04-479ea77c77bc" path="/var/lib/kubelet/pods/fd9a587b-445e-4063-bc04-479ea77c77bc/volumes" Oct 05 20:40:42 crc kubenswrapper[4753]: I1005 20:40:42.316838 4753 generic.go:334] "Generic (PLEG): container finished" podID="0d82b913-d184-4394-8ec8-bc43006d5c38" containerID="ac40edaaf4c147bed35dc4d220690835d3a96a7ff30069626b8fe66a3f6b5c3f" exitCode=0 Oct 05 20:40:42 crc kubenswrapper[4753]: I1005 20:40:42.316962 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" event={"ID":"0d82b913-d184-4394-8ec8-bc43006d5c38","Type":"ContainerDied","Data":"ac40edaaf4c147bed35dc4d220690835d3a96a7ff30069626b8fe66a3f6b5c3f"} Oct 05 20:40:43 crc kubenswrapper[4753]: I1005 20:40:43.762364 4753 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" Oct 05 20:40:43 crc kubenswrapper[4753]: I1005 20:40:43.926797 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d82b913-d184-4394-8ec8-bc43006d5c38-ssh-key\") pod \"0d82b913-d184-4394-8ec8-bc43006d5c38\" (UID: \"0d82b913-d184-4394-8ec8-bc43006d5c38\") " Oct 05 20:40:43 crc kubenswrapper[4753]: I1005 20:40:43.926952 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h565\" (UniqueName: \"kubernetes.io/projected/0d82b913-d184-4394-8ec8-bc43006d5c38-kube-api-access-5h565\") pod \"0d82b913-d184-4394-8ec8-bc43006d5c38\" (UID: \"0d82b913-d184-4394-8ec8-bc43006d5c38\") " Oct 05 20:40:43 crc kubenswrapper[4753]: I1005 20:40:43.927122 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d82b913-d184-4394-8ec8-bc43006d5c38-inventory\") pod \"0d82b913-d184-4394-8ec8-bc43006d5c38\" (UID: \"0d82b913-d184-4394-8ec8-bc43006d5c38\") " Oct 05 20:40:43 crc kubenswrapper[4753]: I1005 20:40:43.932205 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d82b913-d184-4394-8ec8-bc43006d5c38-kube-api-access-5h565" (OuterVolumeSpecName: "kube-api-access-5h565") pod "0d82b913-d184-4394-8ec8-bc43006d5c38" (UID: "0d82b913-d184-4394-8ec8-bc43006d5c38"). InnerVolumeSpecName "kube-api-access-5h565". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:40:43 crc kubenswrapper[4753]: I1005 20:40:43.961707 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d82b913-d184-4394-8ec8-bc43006d5c38-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0d82b913-d184-4394-8ec8-bc43006d5c38" (UID: "0d82b913-d184-4394-8ec8-bc43006d5c38"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:40:43 crc kubenswrapper[4753]: I1005 20:40:43.971353 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d82b913-d184-4394-8ec8-bc43006d5c38-inventory" (OuterVolumeSpecName: "inventory") pod "0d82b913-d184-4394-8ec8-bc43006d5c38" (UID: "0d82b913-d184-4394-8ec8-bc43006d5c38"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.028993 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d82b913-d184-4394-8ec8-bc43006d5c38-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.029024 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d82b913-d184-4394-8ec8-bc43006d5c38-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.029038 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h565\" (UniqueName: \"kubernetes.io/projected/0d82b913-d184-4394-8ec8-bc43006d5c38-kube-api-access-5h565\") on node \"crc\" DevicePath \"\"" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.339338 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" event={"ID":"0d82b913-d184-4394-8ec8-bc43006d5c38","Type":"ContainerDied","Data":"e13e69f89dbc5dd055aa124319b0250076dbfc2431e0c8ce465ed3ace778238e"} Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.339390 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e13e69f89dbc5dd055aa124319b0250076dbfc2431e0c8ce465ed3ace778238e" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.339399 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.421881 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89"] Oct 05 20:40:44 crc kubenswrapper[4753]: E1005 20:40:44.422501 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c53741-7d89-4945-b9c8-4710e743a473" containerName="extract-utilities" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.422602 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c53741-7d89-4945-b9c8-4710e743a473" containerName="extract-utilities" Oct 05 20:40:44 crc kubenswrapper[4753]: E1005 20:40:44.422680 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c53741-7d89-4945-b9c8-4710e743a473" containerName="extract-content" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.422742 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c53741-7d89-4945-b9c8-4710e743a473" containerName="extract-content" Oct 05 20:40:44 crc kubenswrapper[4753]: E1005 20:40:44.422808 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c53741-7d89-4945-b9c8-4710e743a473" containerName="registry-server" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.422875 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c53741-7d89-4945-b9c8-4710e743a473" containerName="registry-server" Oct 05 20:40:44 crc kubenswrapper[4753]: E1005 20:40:44.422956 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d82b913-d184-4394-8ec8-bc43006d5c38" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.423019 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d82b913-d184-4394-8ec8-bc43006d5c38" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 05 20:40:44 crc 
kubenswrapper[4753]: I1005 20:40:44.423308 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d82b913-d184-4394-8ec8-bc43006d5c38" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.423407 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c53741-7d89-4945-b9c8-4710e743a473" containerName="registry-server" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.424166 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.428324 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.429397 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.429830 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.437565 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.452294 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89"] Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.550393 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0b6d15a-036b-4a52-b581-c5ebed291529-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8b89\" (UID: \"d0b6d15a-036b-4a52-b581-c5ebed291529\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.550698 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0b6d15a-036b-4a52-b581-c5ebed291529-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8b89\" (UID: \"d0b6d15a-036b-4a52-b581-c5ebed291529\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.550887 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bwgr\" (UniqueName: \"kubernetes.io/projected/d0b6d15a-036b-4a52-b581-c5ebed291529-kube-api-access-6bwgr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8b89\" (UID: \"d0b6d15a-036b-4a52-b581-c5ebed291529\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.652730 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0b6d15a-036b-4a52-b581-c5ebed291529-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8b89\" (UID: \"d0b6d15a-036b-4a52-b581-c5ebed291529\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.652837 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bwgr\" (UniqueName: \"kubernetes.io/projected/d0b6d15a-036b-4a52-b581-c5ebed291529-kube-api-access-6bwgr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8b89\" (UID: \"d0b6d15a-036b-4a52-b581-c5ebed291529\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.652918 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0b6d15a-036b-4a52-b581-c5ebed291529-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8b89\" (UID: \"d0b6d15a-036b-4a52-b581-c5ebed291529\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.657057 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0b6d15a-036b-4a52-b581-c5ebed291529-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8b89\" (UID: \"d0b6d15a-036b-4a52-b581-c5ebed291529\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.658477 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0b6d15a-036b-4a52-b581-c5ebed291529-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8b89\" (UID: \"d0b6d15a-036b-4a52-b581-c5ebed291529\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.678685 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bwgr\" (UniqueName: \"kubernetes.io/projected/d0b6d15a-036b-4a52-b581-c5ebed291529-kube-api-access-6bwgr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8b89\" (UID: \"d0b6d15a-036b-4a52-b581-c5ebed291529\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" Oct 05 20:40:44 crc kubenswrapper[4753]: I1005 20:40:44.747050 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" Oct 05 20:40:45 crc kubenswrapper[4753]: I1005 20:40:45.258285 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 05 20:40:45 crc kubenswrapper[4753]: I1005 20:40:45.258305 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89"] Oct 05 20:40:45 crc kubenswrapper[4753]: I1005 20:40:45.350671 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" event={"ID":"d0b6d15a-036b-4a52-b581-c5ebed291529","Type":"ContainerStarted","Data":"a028e71a5f1055ecc1d85bb50fe74c4d518c6dca066ce7b13dd407d5cf45ef76"} Oct 05 20:40:46 crc kubenswrapper[4753]: I1005 20:40:46.361663 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" event={"ID":"d0b6d15a-036b-4a52-b581-c5ebed291529","Type":"ContainerStarted","Data":"bb531e045fb32d1ccb08ce22dd51084abb98705bf211ecea214b83c57141430b"} Oct 05 20:40:46 crc kubenswrapper[4753]: I1005 20:40:46.386510 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" podStartSLOduration=1.940550695 podStartE2EDuration="2.386490908s" podCreationTimestamp="2025-10-05 20:40:44 +0000 UTC" firstStartedPulling="2025-10-05 20:40:45.257960329 +0000 UTC m=+1554.106288571" lastFinishedPulling="2025-10-05 20:40:45.703900552 +0000 UTC m=+1554.552228784" observedRunningTime="2025-10-05 20:40:46.382637097 +0000 UTC m=+1555.230965339" watchObservedRunningTime="2025-10-05 20:40:46.386490908 +0000 UTC m=+1555.234819150" Oct 05 20:40:52 crc kubenswrapper[4753]: I1005 20:40:52.440795 4753 generic.go:334] "Generic (PLEG): container finished" podID="d0b6d15a-036b-4a52-b581-c5ebed291529" 
containerID="bb531e045fb32d1ccb08ce22dd51084abb98705bf211ecea214b83c57141430b" exitCode=0 Oct 05 20:40:52 crc kubenswrapper[4753]: I1005 20:40:52.440956 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" event={"ID":"d0b6d15a-036b-4a52-b581-c5ebed291529","Type":"ContainerDied","Data":"bb531e045fb32d1ccb08ce22dd51084abb98705bf211ecea214b83c57141430b"} Oct 05 20:40:53 crc kubenswrapper[4753]: I1005 20:40:53.043982 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-8mgfs"] Oct 05 20:40:53 crc kubenswrapper[4753]: I1005 20:40:53.057294 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-zbs5q"] Oct 05 20:40:53 crc kubenswrapper[4753]: I1005 20:40:53.064883 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-jwnks"] Oct 05 20:40:53 crc kubenswrapper[4753]: I1005 20:40:53.071126 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-zbs5q"] Oct 05 20:40:53 crc kubenswrapper[4753]: I1005 20:40:53.078502 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-8mgfs"] Oct 05 20:40:53 crc kubenswrapper[4753]: I1005 20:40:53.085090 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-jwnks"] Oct 05 20:40:53 crc kubenswrapper[4753]: I1005 20:40:53.863338 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="389992bc-e15e-444e-b7ca-7f7212ad86d0" path="/var/lib/kubelet/pods/389992bc-e15e-444e-b7ca-7f7212ad86d0/volumes" Oct 05 20:40:53 crc kubenswrapper[4753]: I1005 20:40:53.869291 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bc975d7-0761-473f-8fa7-d9f0980ac9bc" path="/var/lib/kubelet/pods/8bc975d7-0761-473f-8fa7-d9f0980ac9bc/volumes" Oct 05 20:40:53 crc kubenswrapper[4753]: I1005 20:40:53.870569 4753 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6" path="/var/lib/kubelet/pods/9e3e2cf1-c36b-4258-a034-4c0a9c5e16e6/volumes" Oct 05 20:40:53 crc kubenswrapper[4753]: I1005 20:40:53.882206 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.018571 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bwgr\" (UniqueName: \"kubernetes.io/projected/d0b6d15a-036b-4a52-b581-c5ebed291529-kube-api-access-6bwgr\") pod \"d0b6d15a-036b-4a52-b581-c5ebed291529\" (UID: \"d0b6d15a-036b-4a52-b581-c5ebed291529\") " Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.019298 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0b6d15a-036b-4a52-b581-c5ebed291529-inventory\") pod \"d0b6d15a-036b-4a52-b581-c5ebed291529\" (UID: \"d0b6d15a-036b-4a52-b581-c5ebed291529\") " Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.019501 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0b6d15a-036b-4a52-b581-c5ebed291529-ssh-key\") pod \"d0b6d15a-036b-4a52-b581-c5ebed291529\" (UID: \"d0b6d15a-036b-4a52-b581-c5ebed291529\") " Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.038177 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b6d15a-036b-4a52-b581-c5ebed291529-kube-api-access-6bwgr" (OuterVolumeSpecName: "kube-api-access-6bwgr") pod "d0b6d15a-036b-4a52-b581-c5ebed291529" (UID: "d0b6d15a-036b-4a52-b581-c5ebed291529"). InnerVolumeSpecName "kube-api-access-6bwgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.046557 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b6d15a-036b-4a52-b581-c5ebed291529-inventory" (OuterVolumeSpecName: "inventory") pod "d0b6d15a-036b-4a52-b581-c5ebed291529" (UID: "d0b6d15a-036b-4a52-b581-c5ebed291529"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.049880 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b6d15a-036b-4a52-b581-c5ebed291529-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d0b6d15a-036b-4a52-b581-c5ebed291529" (UID: "d0b6d15a-036b-4a52-b581-c5ebed291529"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.122644 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bwgr\" (UniqueName: \"kubernetes.io/projected/d0b6d15a-036b-4a52-b581-c5ebed291529-kube-api-access-6bwgr\") on node \"crc\" DevicePath \"\"" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.122678 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0b6d15a-036b-4a52-b581-c5ebed291529-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.122687 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0b6d15a-036b-4a52-b581-c5ebed291529-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.464361 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" 
event={"ID":"d0b6d15a-036b-4a52-b581-c5ebed291529","Type":"ContainerDied","Data":"a028e71a5f1055ecc1d85bb50fe74c4d518c6dca066ce7b13dd407d5cf45ef76"} Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.464413 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a028e71a5f1055ecc1d85bb50fe74c4d518c6dca066ce7b13dd407d5cf45ef76" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.464857 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.533028 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s"] Oct 05 20:40:54 crc kubenswrapper[4753]: E1005 20:40:54.533464 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b6d15a-036b-4a52-b581-c5ebed291529" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.533485 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b6d15a-036b-4a52-b581-c5ebed291529" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.533650 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b6d15a-036b-4a52-b581-c5ebed291529" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.534248 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.537130 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.537845 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.538050 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.539396 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.600791 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s"] Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.632555 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-br66s\" (UID: \"7a01b0b7-7dab-4020-811f-55c4d0f5b45d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.633028 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb9wk\" (UniqueName: \"kubernetes.io/projected/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-kube-api-access-qb9wk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-br66s\" (UID: \"7a01b0b7-7dab-4020-811f-55c4d0f5b45d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.633167 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-br66s\" (UID: \"7a01b0b7-7dab-4020-811f-55c4d0f5b45d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.735591 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb9wk\" (UniqueName: \"kubernetes.io/projected/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-kube-api-access-qb9wk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-br66s\" (UID: \"7a01b0b7-7dab-4020-811f-55c4d0f5b45d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.735653 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-br66s\" (UID: \"7a01b0b7-7dab-4020-811f-55c4d0f5b45d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.735762 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-br66s\" (UID: \"7a01b0b7-7dab-4020-811f-55c4d0f5b45d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.746848 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-br66s\" (UID: 
\"7a01b0b7-7dab-4020-811f-55c4d0f5b45d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.747899 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-br66s\" (UID: \"7a01b0b7-7dab-4020-811f-55c4d0f5b45d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.763783 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb9wk\" (UniqueName: \"kubernetes.io/projected/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-kube-api-access-qb9wk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-br66s\" (UID: \"7a01b0b7-7dab-4020-811f-55c4d0f5b45d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" Oct 05 20:40:54 crc kubenswrapper[4753]: I1005 20:40:54.860683 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" Oct 05 20:40:55 crc kubenswrapper[4753]: I1005 20:40:55.408038 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s"] Oct 05 20:40:55 crc kubenswrapper[4753]: I1005 20:40:55.473573 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" event={"ID":"7a01b0b7-7dab-4020-811f-55c4d0f5b45d","Type":"ContainerStarted","Data":"e6345aebf4b1923b2a566d55893540ecb7a197b9d3a094297a32ec0c0c61b20d"} Oct 05 20:40:56 crc kubenswrapper[4753]: I1005 20:40:56.481372 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" event={"ID":"7a01b0b7-7dab-4020-811f-55c4d0f5b45d","Type":"ContainerStarted","Data":"423a7d70191a6277b2e62814df0cc1435d907cc333dbbba7fded356bcbe254c5"} Oct 05 20:40:56 crc kubenswrapper[4753]: I1005 20:40:56.503863 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" podStartSLOduration=2.051718162 podStartE2EDuration="2.503846501s" podCreationTimestamp="2025-10-05 20:40:54 +0000 UTC" firstStartedPulling="2025-10-05 20:40:55.425414083 +0000 UTC m=+1564.273742325" lastFinishedPulling="2025-10-05 20:40:55.877542422 +0000 UTC m=+1564.725870664" observedRunningTime="2025-10-05 20:40:56.49682621 +0000 UTC m=+1565.345154442" watchObservedRunningTime="2025-10-05 20:40:56.503846501 +0000 UTC m=+1565.352174733" Oct 05 20:41:00 crc kubenswrapper[4753]: I1005 20:41:00.034660 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-mlx4p"] Oct 05 20:41:00 crc kubenswrapper[4753]: I1005 20:41:00.045244 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-mlx4p"] Oct 05 20:41:01 crc kubenswrapper[4753]: I1005 20:41:01.034743 4753 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-ncbw7"] Oct 05 20:41:01 crc kubenswrapper[4753]: I1005 20:41:01.042481 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-ncbw7"] Oct 05 20:41:01 crc kubenswrapper[4753]: I1005 20:41:01.914437 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0521eb28-0c37-447f-a80c-60e5187098a5" path="/var/lib/kubelet/pods/0521eb28-0c37-447f-a80c-60e5187098a5/volumes" Oct 05 20:41:01 crc kubenswrapper[4753]: I1005 20:41:01.915781 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ab65a8e-6432-45b7-a2ed-c252ebbe60ba" path="/var/lib/kubelet/pods/1ab65a8e-6432-45b7-a2ed-c252ebbe60ba/volumes" Oct 05 20:41:01 crc kubenswrapper[4753]: I1005 20:41:01.943981 4753 scope.go:117] "RemoveContainer" containerID="f5559f50204feea7cdc73ca6185cb0db5f58e42fb098fc76c6812f6e426a5bd4" Oct 05 20:41:01 crc kubenswrapper[4753]: I1005 20:41:01.969894 4753 scope.go:117] "RemoveContainer" containerID="704dde8ea9c6a0cc23bc899d20b63c43f6171e917e1570b4bf6e3dd041497fa8" Oct 05 20:41:02 crc kubenswrapper[4753]: I1005 20:41:02.022294 4753 scope.go:117] "RemoveContainer" containerID="617ff2b16eef7f0782a945c8a904bf55feba537c39ada4f52a808e644ca179fe" Oct 05 20:41:02 crc kubenswrapper[4753]: I1005 20:41:02.065975 4753 scope.go:117] "RemoveContainer" containerID="f6db53208919ef56c84b301868dcc0be286ca72b29dbd8d81691ad792a187b0e" Oct 05 20:41:02 crc kubenswrapper[4753]: I1005 20:41:02.112444 4753 scope.go:117] "RemoveContainer" containerID="bf3403508f00872b209bead25e80f081623e3c861ed59fd3cb93ae2177567de6" Oct 05 20:41:02 crc kubenswrapper[4753]: I1005 20:41:02.152587 4753 scope.go:117] "RemoveContainer" containerID="4f900490c738096c748cb64d00deb888c83e0facbd08c0e7fcf8ce224e1b3553" Oct 05 20:41:02 crc kubenswrapper[4753]: I1005 20:41:02.180422 4753 scope.go:117] "RemoveContainer" containerID="f4fd978be212d0546236c9f5f046acd92e193fb91e687e33a566904b6bf49508" 
Oct 05 20:41:02 crc kubenswrapper[4753]: I1005 20:41:02.210351 4753 scope.go:117] "RemoveContainer" containerID="64d6643fe6b64a1c4fe1ba8be4cc6a9e0d2abb9890e449bddb75e47717e8e554" Oct 05 20:41:02 crc kubenswrapper[4753]: I1005 20:41:02.241867 4753 scope.go:117] "RemoveContainer" containerID="a5bc681ae021c2271b974c44eaf1a85451c716bbca34287851922de276ce87a0" Oct 05 20:41:02 crc kubenswrapper[4753]: I1005 20:41:02.261450 4753 scope.go:117] "RemoveContainer" containerID="4e289b5008df17d6e9d98804a7f7af075fa15472ce1e33968ea7c587cf5f51ef" Oct 05 20:41:02 crc kubenswrapper[4753]: I1005 20:41:02.279848 4753 scope.go:117] "RemoveContainer" containerID="98c5303f9c6aff00c19f4b5d3fe957abbf04f336cd37531b2f48be5b18d81df3" Oct 05 20:41:04 crc kubenswrapper[4753]: I1005 20:41:04.489561 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:41:04 crc kubenswrapper[4753]: I1005 20:41:04.489615 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:41:04 crc kubenswrapper[4753]: I1005 20:41:04.489663 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:41:04 crc kubenswrapper[4753]: I1005 20:41:04.490439 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a"} 
pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 20:41:04 crc kubenswrapper[4753]: I1005 20:41:04.490499 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" containerID="cri-o://f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" gracePeriod=600 Oct 05 20:41:04 crc kubenswrapper[4753]: E1005 20:41:04.620928 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:41:05 crc kubenswrapper[4753]: I1005 20:41:05.573628 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" exitCode=0 Oct 05 20:41:05 crc kubenswrapper[4753]: I1005 20:41:05.573708 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a"} Oct 05 20:41:05 crc kubenswrapper[4753]: I1005 20:41:05.574048 4753 scope.go:117] "RemoveContainer" containerID="9030d6a687aa30375abc82b21b6ddaee627312bc9d654f1ab51991af41da5fd6" Oct 05 20:41:05 crc kubenswrapper[4753]: I1005 20:41:05.574797 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 
05 20:41:05 crc kubenswrapper[4753]: E1005 20:41:05.575075 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:41:09 crc kubenswrapper[4753]: I1005 20:41:09.027222 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7ccf-account-create-874gd"] Oct 05 20:41:09 crc kubenswrapper[4753]: I1005 20:41:09.039591 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7ccf-account-create-874gd"] Oct 05 20:41:09 crc kubenswrapper[4753]: I1005 20:41:09.862370 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="189f9e06-87f8-47ee-83be-6e479c93ec63" path="/var/lib/kubelet/pods/189f9e06-87f8-47ee-83be-6e479c93ec63/volumes" Oct 05 20:41:13 crc kubenswrapper[4753]: I1005 20:41:13.045007 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b8c4-account-create-9bhss"] Oct 05 20:41:13 crc kubenswrapper[4753]: I1005 20:41:13.053070 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1002-account-create-wctbs"] Oct 05 20:41:13 crc kubenswrapper[4753]: I1005 20:41:13.064767 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b8c4-account-create-9bhss"] Oct 05 20:41:13 crc kubenswrapper[4753]: I1005 20:41:13.072352 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1002-account-create-wctbs"] Oct 05 20:41:13 crc kubenswrapper[4753]: I1005 20:41:13.864720 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60a64254-c591-486d-bd24-c1fc63bdb561" 
path="/var/lib/kubelet/pods/60a64254-c591-486d-bd24-c1fc63bdb561/volumes" Oct 05 20:41:13 crc kubenswrapper[4753]: I1005 20:41:13.865611 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fef0457b-e235-4baf-8095-736b17c17fd7" path="/var/lib/kubelet/pods/fef0457b-e235-4baf-8095-736b17c17fd7/volumes" Oct 05 20:41:18 crc kubenswrapper[4753]: I1005 20:41:18.030090 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-flzmg"] Oct 05 20:41:18 crc kubenswrapper[4753]: I1005 20:41:18.038021 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-flzmg"] Oct 05 20:41:19 crc kubenswrapper[4753]: I1005 20:41:19.862334 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3154b1ca-ea81-4fc8-ba7d-ff439e97c930" path="/var/lib/kubelet/pods/3154b1ca-ea81-4fc8-ba7d-ff439e97c930/volumes" Oct 05 20:41:20 crc kubenswrapper[4753]: I1005 20:41:20.853091 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:41:20 crc kubenswrapper[4753]: E1005 20:41:20.853845 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:41:23 crc kubenswrapper[4753]: I1005 20:41:23.028764 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7btxb"] Oct 05 20:41:23 crc kubenswrapper[4753]: I1005 20:41:23.036546 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7btxb"] Oct 05 20:41:23 crc kubenswrapper[4753]: I1005 20:41:23.863801 4753 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="04bd2ecd-c468-4efa-877b-983c27dde353" path="/var/lib/kubelet/pods/04bd2ecd-c468-4efa-877b-983c27dde353/volumes" Oct 05 20:41:33 crc kubenswrapper[4753]: I1005 20:41:33.852983 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:41:33 crc kubenswrapper[4753]: E1005 20:41:33.853666 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:41:36 crc kubenswrapper[4753]: I1005 20:41:36.029598 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-w27gb"] Oct 05 20:41:36 crc kubenswrapper[4753]: I1005 20:41:36.035935 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-w27gb"] Oct 05 20:41:37 crc kubenswrapper[4753]: I1005 20:41:37.865543 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c440ae46-1143-44a6-8443-1b4c27fda1d1" path="/var/lib/kubelet/pods/c440ae46-1143-44a6-8443-1b4c27fda1d1/volumes" Oct 05 20:41:40 crc kubenswrapper[4753]: I1005 20:41:40.904645 4753 generic.go:334] "Generic (PLEG): container finished" podID="7a01b0b7-7dab-4020-811f-55c4d0f5b45d" containerID="423a7d70191a6277b2e62814df0cc1435d907cc333dbbba7fded356bcbe254c5" exitCode=0 Oct 05 20:41:40 crc kubenswrapper[4753]: I1005 20:41:40.904718 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" event={"ID":"7a01b0b7-7dab-4020-811f-55c4d0f5b45d","Type":"ContainerDied","Data":"423a7d70191a6277b2e62814df0cc1435d907cc333dbbba7fded356bcbe254c5"} Oct 05 20:41:42 crc 
kubenswrapper[4753]: I1005 20:41:42.323010 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" Oct 05 20:41:42 crc kubenswrapper[4753]: I1005 20:41:42.469531 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-inventory\") pod \"7a01b0b7-7dab-4020-811f-55c4d0f5b45d\" (UID: \"7a01b0b7-7dab-4020-811f-55c4d0f5b45d\") " Oct 05 20:41:42 crc kubenswrapper[4753]: I1005 20:41:42.469748 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-ssh-key\") pod \"7a01b0b7-7dab-4020-811f-55c4d0f5b45d\" (UID: \"7a01b0b7-7dab-4020-811f-55c4d0f5b45d\") " Oct 05 20:41:42 crc kubenswrapper[4753]: I1005 20:41:42.469797 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb9wk\" (UniqueName: \"kubernetes.io/projected/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-kube-api-access-qb9wk\") pod \"7a01b0b7-7dab-4020-811f-55c4d0f5b45d\" (UID: \"7a01b0b7-7dab-4020-811f-55c4d0f5b45d\") " Oct 05 20:41:42 crc kubenswrapper[4753]: I1005 20:41:42.476278 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-kube-api-access-qb9wk" (OuterVolumeSpecName: "kube-api-access-qb9wk") pod "7a01b0b7-7dab-4020-811f-55c4d0f5b45d" (UID: "7a01b0b7-7dab-4020-811f-55c4d0f5b45d"). InnerVolumeSpecName "kube-api-access-qb9wk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:41:42 crc kubenswrapper[4753]: I1005 20:41:42.503396 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7a01b0b7-7dab-4020-811f-55c4d0f5b45d" (UID: "7a01b0b7-7dab-4020-811f-55c4d0f5b45d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:41:42 crc kubenswrapper[4753]: I1005 20:41:42.503914 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-inventory" (OuterVolumeSpecName: "inventory") pod "7a01b0b7-7dab-4020-811f-55c4d0f5b45d" (UID: "7a01b0b7-7dab-4020-811f-55c4d0f5b45d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:41:42 crc kubenswrapper[4753]: I1005 20:41:42.572253 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:41:42 crc kubenswrapper[4753]: I1005 20:41:42.572678 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:41:42 crc kubenswrapper[4753]: I1005 20:41:42.572695 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb9wk\" (UniqueName: \"kubernetes.io/projected/7a01b0b7-7dab-4020-811f-55c4d0f5b45d-kube-api-access-qb9wk\") on node \"crc\" DevicePath \"\"" Oct 05 20:41:42 crc kubenswrapper[4753]: I1005 20:41:42.929065 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" 
event={"ID":"7a01b0b7-7dab-4020-811f-55c4d0f5b45d","Type":"ContainerDied","Data":"e6345aebf4b1923b2a566d55893540ecb7a197b9d3a094297a32ec0c0c61b20d"} Oct 05 20:41:42 crc kubenswrapper[4753]: I1005 20:41:42.929104 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6345aebf4b1923b2a566d55893540ecb7a197b9d3a094297a32ec0c0c61b20d" Oct 05 20:41:42 crc kubenswrapper[4753]: I1005 20:41:42.929171 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.003820 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn"] Oct 05 20:41:43 crc kubenswrapper[4753]: E1005 20:41:43.004252 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a01b0b7-7dab-4020-811f-55c4d0f5b45d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.004272 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a01b0b7-7dab-4020-811f-55c4d0f5b45d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.004498 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a01b0b7-7dab-4020-811f-55c4d0f5b45d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.005206 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.007801 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.008339 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.008339 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.008405 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.021447 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn"] Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.088723 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76r9b\" (UniqueName: \"kubernetes.io/projected/63b10f94-2d2e-402a-8f62-92c7e33eef92-kube-api-access-76r9b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn\" (UID: \"63b10f94-2d2e-402a-8f62-92c7e33eef92\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.088833 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63b10f94-2d2e-402a-8f62-92c7e33eef92-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn\" (UID: \"63b10f94-2d2e-402a-8f62-92c7e33eef92\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.088920 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63b10f94-2d2e-402a-8f62-92c7e33eef92-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn\" (UID: \"63b10f94-2d2e-402a-8f62-92c7e33eef92\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.189961 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63b10f94-2d2e-402a-8f62-92c7e33eef92-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn\" (UID: \"63b10f94-2d2e-402a-8f62-92c7e33eef92\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.190053 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76r9b\" (UniqueName: \"kubernetes.io/projected/63b10f94-2d2e-402a-8f62-92c7e33eef92-kube-api-access-76r9b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn\" (UID: \"63b10f94-2d2e-402a-8f62-92c7e33eef92\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.190099 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63b10f94-2d2e-402a-8f62-92c7e33eef92-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn\" (UID: \"63b10f94-2d2e-402a-8f62-92c7e33eef92\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.194222 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63b10f94-2d2e-402a-8f62-92c7e33eef92-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn\" (UID: 
\"63b10f94-2d2e-402a-8f62-92c7e33eef92\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.195390 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63b10f94-2d2e-402a-8f62-92c7e33eef92-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn\" (UID: \"63b10f94-2d2e-402a-8f62-92c7e33eef92\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.209076 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76r9b\" (UniqueName: \"kubernetes.io/projected/63b10f94-2d2e-402a-8f62-92c7e33eef92-kube-api-access-76r9b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn\" (UID: \"63b10f94-2d2e-402a-8f62-92c7e33eef92\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.364714 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.897189 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn"] Oct 05 20:41:43 crc kubenswrapper[4753]: I1005 20:41:43.937347 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" event={"ID":"63b10f94-2d2e-402a-8f62-92c7e33eef92","Type":"ContainerStarted","Data":"acb5f08449e31d706456802bca2e4473706cbee6048b619b8a2d0f72a407343a"} Oct 05 20:41:44 crc kubenswrapper[4753]: I1005 20:41:44.948523 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" event={"ID":"63b10f94-2d2e-402a-8f62-92c7e33eef92","Type":"ContainerStarted","Data":"c56cea440ef60fbb4b698d0d8bd8acd986b98357fe70706b0c44084e495e58fc"} Oct 05 20:41:44 crc kubenswrapper[4753]: I1005 20:41:44.969791 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" podStartSLOduration=2.535903686 podStartE2EDuration="2.969772067s" podCreationTimestamp="2025-10-05 20:41:42 +0000 UTC" firstStartedPulling="2025-10-05 20:41:43.911162035 +0000 UTC m=+1612.759490267" lastFinishedPulling="2025-10-05 20:41:44.345030406 +0000 UTC m=+1613.193358648" observedRunningTime="2025-10-05 20:41:44.969616372 +0000 UTC m=+1613.817944644" watchObservedRunningTime="2025-10-05 20:41:44.969772067 +0000 UTC m=+1613.818100299" Oct 05 20:41:46 crc kubenswrapper[4753]: I1005 20:41:46.852717 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:41:46 crc kubenswrapper[4753]: E1005 20:41:46.853074 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:41:48 crc kubenswrapper[4753]: I1005 20:41:48.989084 4753 generic.go:334] "Generic (PLEG): container finished" podID="63b10f94-2d2e-402a-8f62-92c7e33eef92" containerID="c56cea440ef60fbb4b698d0d8bd8acd986b98357fe70706b0c44084e495e58fc" exitCode=0 Oct 05 20:41:48 crc kubenswrapper[4753]: I1005 20:41:48.989196 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" event={"ID":"63b10f94-2d2e-402a-8f62-92c7e33eef92","Type":"ContainerDied","Data":"c56cea440ef60fbb4b698d0d8bd8acd986b98357fe70706b0c44084e495e58fc"} Oct 05 20:41:50 crc kubenswrapper[4753]: I1005 20:41:50.487153 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" Oct 05 20:41:50 crc kubenswrapper[4753]: I1005 20:41:50.637652 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63b10f94-2d2e-402a-8f62-92c7e33eef92-ssh-key\") pod \"63b10f94-2d2e-402a-8f62-92c7e33eef92\" (UID: \"63b10f94-2d2e-402a-8f62-92c7e33eef92\") " Oct 05 20:41:50 crc kubenswrapper[4753]: I1005 20:41:50.637694 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76r9b\" (UniqueName: \"kubernetes.io/projected/63b10f94-2d2e-402a-8f62-92c7e33eef92-kube-api-access-76r9b\") pod \"63b10f94-2d2e-402a-8f62-92c7e33eef92\" (UID: \"63b10f94-2d2e-402a-8f62-92c7e33eef92\") " Oct 05 20:41:50 crc kubenswrapper[4753]: I1005 20:41:50.637721 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63b10f94-2d2e-402a-8f62-92c7e33eef92-inventory\") pod \"63b10f94-2d2e-402a-8f62-92c7e33eef92\" (UID: \"63b10f94-2d2e-402a-8f62-92c7e33eef92\") " Oct 05 20:41:50 crc kubenswrapper[4753]: I1005 20:41:50.648502 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63b10f94-2d2e-402a-8f62-92c7e33eef92-kube-api-access-76r9b" (OuterVolumeSpecName: "kube-api-access-76r9b") pod "63b10f94-2d2e-402a-8f62-92c7e33eef92" (UID: "63b10f94-2d2e-402a-8f62-92c7e33eef92"). InnerVolumeSpecName "kube-api-access-76r9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:41:50 crc kubenswrapper[4753]: I1005 20:41:50.664404 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63b10f94-2d2e-402a-8f62-92c7e33eef92-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "63b10f94-2d2e-402a-8f62-92c7e33eef92" (UID: "63b10f94-2d2e-402a-8f62-92c7e33eef92"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:41:50 crc kubenswrapper[4753]: I1005 20:41:50.698478 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63b10f94-2d2e-402a-8f62-92c7e33eef92-inventory" (OuterVolumeSpecName: "inventory") pod "63b10f94-2d2e-402a-8f62-92c7e33eef92" (UID: "63b10f94-2d2e-402a-8f62-92c7e33eef92"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:41:50 crc kubenswrapper[4753]: I1005 20:41:50.740585 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63b10f94-2d2e-402a-8f62-92c7e33eef92-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:41:50 crc kubenswrapper[4753]: I1005 20:41:50.740637 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76r9b\" (UniqueName: \"kubernetes.io/projected/63b10f94-2d2e-402a-8f62-92c7e33eef92-kube-api-access-76r9b\") on node \"crc\" DevicePath \"\"" Oct 05 20:41:50 crc kubenswrapper[4753]: I1005 20:41:50.740658 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63b10f94-2d2e-402a-8f62-92c7e33eef92-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.021661 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" event={"ID":"63b10f94-2d2e-402a-8f62-92c7e33eef92","Type":"ContainerDied","Data":"acb5f08449e31d706456802bca2e4473706cbee6048b619b8a2d0f72a407343a"} Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.021721 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acb5f08449e31d706456802bca2e4473706cbee6048b619b8a2d0f72a407343a" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.021799 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.104595 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll"] Oct 05 20:41:51 crc kubenswrapper[4753]: E1005 20:41:51.105222 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63b10f94-2d2e-402a-8f62-92c7e33eef92" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.105244 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="63b10f94-2d2e-402a-8f62-92c7e33eef92" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.105423 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="63b10f94-2d2e-402a-8f62-92c7e33eef92" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.106126 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.110175 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll"] Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.114454 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.114523 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.114687 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.114884 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.251288 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll\" (UID: \"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.251421 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d95sb\" (UniqueName: \"kubernetes.io/projected/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-kube-api-access-d95sb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll\" (UID: \"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.251589 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll\" (UID: \"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.353840 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll\" (UID: \"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.353893 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d95sb\" (UniqueName: \"kubernetes.io/projected/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-kube-api-access-d95sb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll\" (UID: \"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.353957 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll\" (UID: \"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.357206 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll\" (UID: 
\"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.357970 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll\" (UID: \"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.369353 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d95sb\" (UniqueName: \"kubernetes.io/projected/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-kube-api-access-d95sb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll\" (UID: \"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.431661 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" Oct 05 20:41:51 crc kubenswrapper[4753]: I1005 20:41:51.954723 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll"] Oct 05 20:41:51 crc kubenswrapper[4753]: W1005 20:41:51.957935 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b8eb5c9_8b94_486b_b8e2_f3ff1b889928.slice/crio-d84b8d08c3932c5ecd0333522fbdc488fa12e4ed2d14e9208bc57a5299777535 WatchSource:0}: Error finding container d84b8d08c3932c5ecd0333522fbdc488fa12e4ed2d14e9208bc57a5299777535: Status 404 returned error can't find the container with id d84b8d08c3932c5ecd0333522fbdc488fa12e4ed2d14e9208bc57a5299777535 Oct 05 20:41:52 crc kubenswrapper[4753]: I1005 20:41:52.030018 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" event={"ID":"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928","Type":"ContainerStarted","Data":"d84b8d08c3932c5ecd0333522fbdc488fa12e4ed2d14e9208bc57a5299777535"} Oct 05 20:41:52 crc kubenswrapper[4753]: I1005 20:41:52.742841 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:41:53 crc kubenswrapper[4753]: I1005 20:41:53.042512 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" event={"ID":"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928","Type":"ContainerStarted","Data":"5e591415b24672a107360f75a655a19c54cbdbcd5327db7892e22141bbf8c7cc"} Oct 05 20:41:53 crc kubenswrapper[4753]: I1005 20:41:53.076401 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" podStartSLOduration=1.296318152 podStartE2EDuration="2.076375458s" podCreationTimestamp="2025-10-05 20:41:51 
+0000 UTC" firstStartedPulling="2025-10-05 20:41:51.959838616 +0000 UTC m=+1620.808166848" lastFinishedPulling="2025-10-05 20:41:52.739895912 +0000 UTC m=+1621.588224154" observedRunningTime="2025-10-05 20:41:53.060717423 +0000 UTC m=+1621.909045735" watchObservedRunningTime="2025-10-05 20:41:53.076375458 +0000 UTC m=+1621.924703720" Oct 05 20:41:55 crc kubenswrapper[4753]: I1005 20:41:55.039493 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-tnf7s"] Oct 05 20:41:55 crc kubenswrapper[4753]: I1005 20:41:55.048403 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-tnf7s"] Oct 05 20:41:55 crc kubenswrapper[4753]: I1005 20:41:55.863778 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3841483-2af9-40d8-8197-531d8dd1e57f" path="/var/lib/kubelet/pods/e3841483-2af9-40d8-8197-531d8dd1e57f/volumes" Oct 05 20:41:56 crc kubenswrapper[4753]: I1005 20:41:56.042321 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-b97mg"] Oct 05 20:41:56 crc kubenswrapper[4753]: I1005 20:41:56.046673 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-b97mg"] Oct 05 20:41:57 crc kubenswrapper[4753]: I1005 20:41:57.852364 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:41:57 crc kubenswrapper[4753]: E1005 20:41:57.853136 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:41:57 crc kubenswrapper[4753]: I1005 20:41:57.863034 4753 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="fa63f817-249e-416d-aa21-47fe6e04180c" path="/var/lib/kubelet/pods/fa63f817-249e-416d-aa21-47fe6e04180c/volumes" Oct 05 20:42:02 crc kubenswrapper[4753]: I1005 20:42:02.459389 4753 scope.go:117] "RemoveContainer" containerID="43014cc60553b045db9e915b5f2ec0958cf6323fcd52f2246bbd6f261ebe0483" Oct 05 20:42:02 crc kubenswrapper[4753]: I1005 20:42:02.522031 4753 scope.go:117] "RemoveContainer" containerID="bfeaa6dc1c6c9a840673b656a702681d5c9be98c793e4a857717f0653f7406b1" Oct 05 20:42:02 crc kubenswrapper[4753]: I1005 20:42:02.547683 4753 scope.go:117] "RemoveContainer" containerID="059fa7c26c414539ba189c6a39977d8007010aaa19369d67671d694f889828b7" Oct 05 20:42:02 crc kubenswrapper[4753]: I1005 20:42:02.608323 4753 scope.go:117] "RemoveContainer" containerID="500836df6df635c1e403426c6e29d4f986a6b8355e0f7dee8a877f4e5585f365" Oct 05 20:42:02 crc kubenswrapper[4753]: I1005 20:42:02.629806 4753 scope.go:117] "RemoveContainer" containerID="9b48a93f66579c4f92855e2f8cdd79a3a8cf212397ad1595b8b848f6832d9b3c" Oct 05 20:42:02 crc kubenswrapper[4753]: I1005 20:42:02.690246 4753 scope.go:117] "RemoveContainer" containerID="1f65b40afdeba568afe9de88bd091a6b6b710eede63ee93a09fb76c30c3a1b29" Oct 05 20:42:02 crc kubenswrapper[4753]: I1005 20:42:02.720076 4753 scope.go:117] "RemoveContainer" containerID="3f013a731f19326bb650fe30d8ba5ae07e7dcce7f5fd634c603aa328ad9655ab" Oct 05 20:42:02 crc kubenswrapper[4753]: I1005 20:42:02.740613 4753 scope.go:117] "RemoveContainer" containerID="3bd2a438f86333ebfd71c5c63d94276e3cb4fe2084ff7a5c42beaabddbd3988e" Oct 05 20:42:12 crc kubenswrapper[4753]: I1005 20:42:12.852281 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:42:12 crc kubenswrapper[4753]: E1005 20:42:12.853262 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:42:26 crc kubenswrapper[4753]: I1005 20:42:26.050610 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-4nx88"] Oct 05 20:42:26 crc kubenswrapper[4753]: I1005 20:42:26.065215 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-rbfbj"] Oct 05 20:42:26 crc kubenswrapper[4753]: I1005 20:42:26.076135 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-jk77b"] Oct 05 20:42:26 crc kubenswrapper[4753]: I1005 20:42:26.085977 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-4nx88"] Oct 05 20:42:26 crc kubenswrapper[4753]: I1005 20:42:26.095628 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-rbfbj"] Oct 05 20:42:26 crc kubenswrapper[4753]: I1005 20:42:26.102548 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-jk77b"] Oct 05 20:42:26 crc kubenswrapper[4753]: I1005 20:42:26.851834 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:42:26 crc kubenswrapper[4753]: E1005 20:42:26.852126 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:42:27 crc kubenswrapper[4753]: I1005 20:42:27.861425 4753 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="959512b8-c8e2-41eb-9405-a99df49caf33" path="/var/lib/kubelet/pods/959512b8-c8e2-41eb-9405-a99df49caf33/volumes" Oct 05 20:42:27 crc kubenswrapper[4753]: I1005 20:42:27.862207 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b80ea4d-52cf-4876-bd3a-436c9e65c93f" path="/var/lib/kubelet/pods/9b80ea4d-52cf-4876-bd3a-436c9e65c93f/volumes" Oct 05 20:42:27 crc kubenswrapper[4753]: I1005 20:42:27.862643 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc" path="/var/lib/kubelet/pods/a1a250cc-b19d-4d1c-acd6-aa2c21ddc3bc/volumes" Oct 05 20:42:36 crc kubenswrapper[4753]: I1005 20:42:36.031399 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-3a64-account-create-b4hzd"] Oct 05 20:42:36 crc kubenswrapper[4753]: I1005 20:42:36.042192 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4cbd-account-create-6zmlh"] Oct 05 20:42:36 crc kubenswrapper[4753]: I1005 20:42:36.048711 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-15d6-account-create-fsjpw"] Oct 05 20:42:36 crc kubenswrapper[4753]: I1005 20:42:36.058122 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-15d6-account-create-fsjpw"] Oct 05 20:42:36 crc kubenswrapper[4753]: I1005 20:42:36.065304 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4cbd-account-create-6zmlh"] Oct 05 20:42:36 crc kubenswrapper[4753]: I1005 20:42:36.073580 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-3a64-account-create-b4hzd"] Oct 05 20:42:37 crc kubenswrapper[4753]: I1005 20:42:37.851973 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:42:37 crc kubenswrapper[4753]: E1005 20:42:37.852582 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:42:37 crc kubenswrapper[4753]: I1005 20:42:37.870988 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b219057-d805-46f6-b393-8c05f64ff2ce" path="/var/lib/kubelet/pods/1b219057-d805-46f6-b393-8c05f64ff2ce/volumes" Oct 05 20:42:37 crc kubenswrapper[4753]: I1005 20:42:37.872747 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d8b551b-c44d-42dc-8a3b-0fbf1df013f2" path="/var/lib/kubelet/pods/6d8b551b-c44d-42dc-8a3b-0fbf1df013f2/volumes" Oct 05 20:42:37 crc kubenswrapper[4753]: I1005 20:42:37.873936 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4" path="/var/lib/kubelet/pods/9ca7c978-fba4-4fdd-a6bb-649c0bfdd9c4/volumes" Oct 05 20:42:50 crc kubenswrapper[4753]: I1005 20:42:50.852731 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:42:50 crc kubenswrapper[4753]: E1005 20:42:50.853481 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:42:56 crc kubenswrapper[4753]: I1005 20:42:56.653613 4753 generic.go:334] "Generic (PLEG): container finished" podID="8b8eb5c9-8b94-486b-b8e2-f3ff1b889928" 
containerID="5e591415b24672a107360f75a655a19c54cbdbcd5327db7892e22141bbf8c7cc" exitCode=2 Oct 05 20:42:56 crc kubenswrapper[4753]: I1005 20:42:56.653705 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" event={"ID":"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928","Type":"ContainerDied","Data":"5e591415b24672a107360f75a655a19c54cbdbcd5327db7892e22141bbf8c7cc"} Oct 05 20:42:57 crc kubenswrapper[4753]: I1005 20:42:57.040512 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5hvlp"] Oct 05 20:42:57 crc kubenswrapper[4753]: I1005 20:42:57.046691 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5hvlp"] Oct 05 20:42:57 crc kubenswrapper[4753]: I1005 20:42:57.878127 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698096ab-42c7-4f67-87b8-27d612fd3c25" path="/var/lib/kubelet/pods/698096ab-42c7-4f67-87b8-27d612fd3c25/volumes" Oct 05 20:42:58 crc kubenswrapper[4753]: I1005 20:42:58.087111 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" Oct 05 20:42:58 crc kubenswrapper[4753]: I1005 20:42:58.147167 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d95sb\" (UniqueName: \"kubernetes.io/projected/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-kube-api-access-d95sb\") pod \"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928\" (UID: \"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928\") " Oct 05 20:42:58 crc kubenswrapper[4753]: I1005 20:42:58.147286 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-ssh-key\") pod \"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928\" (UID: \"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928\") " Oct 05 20:42:58 crc kubenswrapper[4753]: I1005 20:42:58.147372 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-inventory\") pod \"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928\" (UID: \"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928\") " Oct 05 20:42:58 crc kubenswrapper[4753]: I1005 20:42:58.168461 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-kube-api-access-d95sb" (OuterVolumeSpecName: "kube-api-access-d95sb") pod "8b8eb5c9-8b94-486b-b8e2-f3ff1b889928" (UID: "8b8eb5c9-8b94-486b-b8e2-f3ff1b889928"). InnerVolumeSpecName "kube-api-access-d95sb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:42:58 crc kubenswrapper[4753]: I1005 20:42:58.191934 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-inventory" (OuterVolumeSpecName: "inventory") pod "8b8eb5c9-8b94-486b-b8e2-f3ff1b889928" (UID: "8b8eb5c9-8b94-486b-b8e2-f3ff1b889928"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:42:58 crc kubenswrapper[4753]: I1005 20:42:58.195918 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8b8eb5c9-8b94-486b-b8e2-f3ff1b889928" (UID: "8b8eb5c9-8b94-486b-b8e2-f3ff1b889928"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:42:58 crc kubenswrapper[4753]: I1005 20:42:58.248991 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:42:58 crc kubenswrapper[4753]: I1005 20:42:58.249888 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d95sb\" (UniqueName: \"kubernetes.io/projected/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-kube-api-access-d95sb\") on node \"crc\" DevicePath \"\"" Oct 05 20:42:58 crc kubenswrapper[4753]: I1005 20:42:58.249926 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:42:58 crc kubenswrapper[4753]: I1005 20:42:58.668963 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" event={"ID":"8b8eb5c9-8b94-486b-b8e2-f3ff1b889928","Type":"ContainerDied","Data":"d84b8d08c3932c5ecd0333522fbdc488fa12e4ed2d14e9208bc57a5299777535"} Oct 05 20:42:58 crc kubenswrapper[4753]: I1005 20:42:58.669306 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d84b8d08c3932c5ecd0333522fbdc488fa12e4ed2d14e9208bc57a5299777535" Oct 05 20:42:58 crc kubenswrapper[4753]: I1005 20:42:58.669055 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll" Oct 05 20:43:02 crc kubenswrapper[4753]: I1005 20:43:02.896883 4753 scope.go:117] "RemoveContainer" containerID="9f3f526fa9e8a7d9f12fc7b2f037c218f766c9695038b539ff8561d71624116d" Oct 05 20:43:02 crc kubenswrapper[4753]: I1005 20:43:02.920738 4753 scope.go:117] "RemoveContainer" containerID="c59099e2f1ac96a60533b2e0becc9d6c8dcc380b6b707d1dee0df9f1587f1077" Oct 05 20:43:02 crc kubenswrapper[4753]: I1005 20:43:02.959168 4753 scope.go:117] "RemoveContainer" containerID="d78cf11b2a8724edb5e71180493e2d325fdec51bf36cf32c7f665541cfe3e31b" Oct 05 20:43:02 crc kubenswrapper[4753]: I1005 20:43:02.991676 4753 scope.go:117] "RemoveContainer" containerID="6494a3d3da0cdfcacd48411aca0233ba49d748583d65f344fa09d00ad812e2ff" Oct 05 20:43:03 crc kubenswrapper[4753]: I1005 20:43:03.058487 4753 scope.go:117] "RemoveContainer" containerID="b5791038b45dcc30ecf7e5daef87bc05e17f4d21d248cbe1e6b5b086e155a1c5" Oct 05 20:43:03 crc kubenswrapper[4753]: I1005 20:43:03.084546 4753 scope.go:117] "RemoveContainer" containerID="c0027ccdaa8a764b096bd84cda96635990185c17af50476fe240fafdc87c0a4b" Oct 05 20:43:03 crc kubenswrapper[4753]: I1005 20:43:03.117794 4753 scope.go:117] "RemoveContainer" containerID="930021d9264c79b3a661cdaabc77f3ad8ebf959f930b845dc018bf3d126170d7" Oct 05 20:43:03 crc kubenswrapper[4753]: I1005 20:43:03.852458 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:43:03 crc kubenswrapper[4753]: E1005 20:43:03.852858 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" 
podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.029874 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4"] Oct 05 20:43:06 crc kubenswrapper[4753]: E1005 20:43:06.030610 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b8eb5c9-8b94-486b-b8e2-f3ff1b889928" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.030627 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b8eb5c9-8b94-486b-b8e2-f3ff1b889928" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.030878 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b8eb5c9-8b94-486b-b8e2-f3ff1b889928" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.031686 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.034493 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.034833 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.034890 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.035306 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.044271 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4"] Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.129993 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4603cbff-4678-463a-b7bb-187826d1717c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4\" (UID: \"4603cbff-4678-463a-b7bb-187826d1717c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.130058 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f58qv\" (UniqueName: \"kubernetes.io/projected/4603cbff-4678-463a-b7bb-187826d1717c-kube-api-access-f58qv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4\" (UID: \"4603cbff-4678-463a-b7bb-187826d1717c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.130193 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4603cbff-4678-463a-b7bb-187826d1717c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4\" (UID: \"4603cbff-4678-463a-b7bb-187826d1717c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.231996 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4603cbff-4678-463a-b7bb-187826d1717c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4\" (UID: \"4603cbff-4678-463a-b7bb-187826d1717c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.232071 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f58qv\" (UniqueName: \"kubernetes.io/projected/4603cbff-4678-463a-b7bb-187826d1717c-kube-api-access-f58qv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4\" (UID: \"4603cbff-4678-463a-b7bb-187826d1717c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.232152 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4603cbff-4678-463a-b7bb-187826d1717c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4\" (UID: \"4603cbff-4678-463a-b7bb-187826d1717c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.238426 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4603cbff-4678-463a-b7bb-187826d1717c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4\" (UID: 
\"4603cbff-4678-463a-b7bb-187826d1717c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.241444 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4603cbff-4678-463a-b7bb-187826d1717c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4\" (UID: \"4603cbff-4678-463a-b7bb-187826d1717c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.260646 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f58qv\" (UniqueName: \"kubernetes.io/projected/4603cbff-4678-463a-b7bb-187826d1717c-kube-api-access-f58qv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4\" (UID: \"4603cbff-4678-463a-b7bb-187826d1717c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.351596 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" Oct 05 20:43:06 crc kubenswrapper[4753]: W1005 20:43:06.945425 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4603cbff_4678_463a_b7bb_187826d1717c.slice/crio-caf147687f98c9ed79a5748ce832634d1caf49bedf9dcb2aa381067b194a8248 WatchSource:0}: Error finding container caf147687f98c9ed79a5748ce832634d1caf49bedf9dcb2aa381067b194a8248: Status 404 returned error can't find the container with id caf147687f98c9ed79a5748ce832634d1caf49bedf9dcb2aa381067b194a8248 Oct 05 20:43:06 crc kubenswrapper[4753]: I1005 20:43:06.947339 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4"] Oct 05 20:43:07 crc kubenswrapper[4753]: I1005 20:43:07.760095 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" event={"ID":"4603cbff-4678-463a-b7bb-187826d1717c","Type":"ContainerStarted","Data":"2ec01b1059ccf847a081ddaae6936d61d267e03806ffcfa6b8c49c4583313ecf"} Oct 05 20:43:07 crc kubenswrapper[4753]: I1005 20:43:07.760658 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" event={"ID":"4603cbff-4678-463a-b7bb-187826d1717c","Type":"ContainerStarted","Data":"caf147687f98c9ed79a5748ce832634d1caf49bedf9dcb2aa381067b194a8248"} Oct 05 20:43:07 crc kubenswrapper[4753]: I1005 20:43:07.803592 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" podStartSLOduration=1.268696635 podStartE2EDuration="1.80355786s" podCreationTimestamp="2025-10-05 20:43:06 +0000 UTC" firstStartedPulling="2025-10-05 20:43:06.947076582 +0000 UTC m=+1695.795404834" lastFinishedPulling="2025-10-05 20:43:07.481937827 +0000 UTC m=+1696.330266059" 
observedRunningTime="2025-10-05 20:43:07.783769567 +0000 UTC m=+1696.632097819" watchObservedRunningTime="2025-10-05 20:43:07.80355786 +0000 UTC m=+1696.651886132" Oct 05 20:43:17 crc kubenswrapper[4753]: I1005 20:43:17.852257 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:43:17 crc kubenswrapper[4753]: E1005 20:43:17.852995 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:43:19 crc kubenswrapper[4753]: I1005 20:43:19.044234 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-kcbx2"] Oct 05 20:43:19 crc kubenswrapper[4753]: I1005 20:43:19.051955 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-kcbx2"] Oct 05 20:43:19 crc kubenswrapper[4753]: I1005 20:43:19.885732 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3472112b-9f63-410c-b285-c5a8cd2fa2fc" path="/var/lib/kubelet/pods/3472112b-9f63-410c-b285-c5a8cd2fa2fc/volumes" Oct 05 20:43:20 crc kubenswrapper[4753]: I1005 20:43:20.033931 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7qts4"] Oct 05 20:43:20 crc kubenswrapper[4753]: I1005 20:43:20.046936 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7qts4"] Oct 05 20:43:21 crc kubenswrapper[4753]: I1005 20:43:21.874631 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72e71171-91fe-4161-9899-93934608eaa2" 
path="/var/lib/kubelet/pods/72e71171-91fe-4161-9899-93934608eaa2/volumes" Oct 05 20:43:30 crc kubenswrapper[4753]: I1005 20:43:30.852542 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:43:30 crc kubenswrapper[4753]: E1005 20:43:30.854393 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:43:44 crc kubenswrapper[4753]: I1005 20:43:44.852473 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:43:44 crc kubenswrapper[4753]: E1005 20:43:44.853327 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:43:57 crc kubenswrapper[4753]: I1005 20:43:57.852867 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:43:57 crc kubenswrapper[4753]: E1005 20:43:57.854448 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:44:03 crc kubenswrapper[4753]: I1005 20:44:03.249133 4753 scope.go:117] "RemoveContainer" containerID="7bd3f75ff782139c51c0348d117f57215ec660be1f07b009d61f048144dc9b0a" Oct 05 20:44:03 crc kubenswrapper[4753]: I1005 20:44:03.294723 4753 scope.go:117] "RemoveContainer" containerID="8b596b16decd9c2626d87ec85a3e00bbf094c8b9354c3ffb26d418d739010f29" Oct 05 20:44:03 crc kubenswrapper[4753]: I1005 20:44:03.303309 4753 generic.go:334] "Generic (PLEG): container finished" podID="4603cbff-4678-463a-b7bb-187826d1717c" containerID="2ec01b1059ccf847a081ddaae6936d61d267e03806ffcfa6b8c49c4583313ecf" exitCode=0 Oct 05 20:44:03 crc kubenswrapper[4753]: I1005 20:44:03.303363 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" event={"ID":"4603cbff-4678-463a-b7bb-187826d1717c","Type":"ContainerDied","Data":"2ec01b1059ccf847a081ddaae6936d61d267e03806ffcfa6b8c49c4583313ecf"} Oct 05 20:44:04 crc kubenswrapper[4753]: I1005 20:44:04.695159 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" Oct 05 20:44:04 crc kubenswrapper[4753]: I1005 20:44:04.760578 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f58qv\" (UniqueName: \"kubernetes.io/projected/4603cbff-4678-463a-b7bb-187826d1717c-kube-api-access-f58qv\") pod \"4603cbff-4678-463a-b7bb-187826d1717c\" (UID: \"4603cbff-4678-463a-b7bb-187826d1717c\") " Oct 05 20:44:04 crc kubenswrapper[4753]: I1005 20:44:04.760636 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4603cbff-4678-463a-b7bb-187826d1717c-ssh-key\") pod \"4603cbff-4678-463a-b7bb-187826d1717c\" (UID: \"4603cbff-4678-463a-b7bb-187826d1717c\") " Oct 05 20:44:04 crc kubenswrapper[4753]: I1005 20:44:04.760708 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4603cbff-4678-463a-b7bb-187826d1717c-inventory\") pod \"4603cbff-4678-463a-b7bb-187826d1717c\" (UID: \"4603cbff-4678-463a-b7bb-187826d1717c\") " Oct 05 20:44:04 crc kubenswrapper[4753]: I1005 20:44:04.783219 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4603cbff-4678-463a-b7bb-187826d1717c-kube-api-access-f58qv" (OuterVolumeSpecName: "kube-api-access-f58qv") pod "4603cbff-4678-463a-b7bb-187826d1717c" (UID: "4603cbff-4678-463a-b7bb-187826d1717c"). InnerVolumeSpecName "kube-api-access-f58qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:44:04 crc kubenswrapper[4753]: I1005 20:44:04.809638 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4603cbff-4678-463a-b7bb-187826d1717c-inventory" (OuterVolumeSpecName: "inventory") pod "4603cbff-4678-463a-b7bb-187826d1717c" (UID: "4603cbff-4678-463a-b7bb-187826d1717c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:44:04 crc kubenswrapper[4753]: I1005 20:44:04.813985 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4603cbff-4678-463a-b7bb-187826d1717c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4603cbff-4678-463a-b7bb-187826d1717c" (UID: "4603cbff-4678-463a-b7bb-187826d1717c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:44:04 crc kubenswrapper[4753]: I1005 20:44:04.863474 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f58qv\" (UniqueName: \"kubernetes.io/projected/4603cbff-4678-463a-b7bb-187826d1717c-kube-api-access-f58qv\") on node \"crc\" DevicePath \"\"" Oct 05 20:44:04 crc kubenswrapper[4753]: I1005 20:44:04.863521 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4603cbff-4678-463a-b7bb-187826d1717c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:44:04 crc kubenswrapper[4753]: I1005 20:44:04.863532 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4603cbff-4678-463a-b7bb-187826d1717c-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.324499 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" event={"ID":"4603cbff-4678-463a-b7bb-187826d1717c","Type":"ContainerDied","Data":"caf147687f98c9ed79a5748ce832634d1caf49bedf9dcb2aa381067b194a8248"} Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.324548 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caf147687f98c9ed79a5748ce832634d1caf49bedf9dcb2aa381067b194a8248" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.324611 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.433791 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-wr58b"] Oct 05 20:44:05 crc kubenswrapper[4753]: E1005 20:44:05.434608 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4603cbff-4678-463a-b7bb-187826d1717c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.434709 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4603cbff-4678-463a-b7bb-187826d1717c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.434944 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4603cbff-4678-463a-b7bb-187826d1717c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.435608 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.437840 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.438504 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.438616 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.443337 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.458921 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-wr58b"] Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.482252 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf2a62ed-7175-456e-953b-551c3148ba91-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-wr58b\" (UID: \"bf2a62ed-7175-456e-953b-551c3148ba91\") " pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.482327 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bf2a62ed-7175-456e-953b-551c3148ba91-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-wr58b\" (UID: \"bf2a62ed-7175-456e-953b-551c3148ba91\") " pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.482355 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qgnjl\" (UniqueName: \"kubernetes.io/projected/bf2a62ed-7175-456e-953b-551c3148ba91-kube-api-access-qgnjl\") pod \"ssh-known-hosts-edpm-deployment-wr58b\" (UID: \"bf2a62ed-7175-456e-953b-551c3148ba91\") " pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.584217 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf2a62ed-7175-456e-953b-551c3148ba91-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-wr58b\" (UID: \"bf2a62ed-7175-456e-953b-551c3148ba91\") " pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.584294 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bf2a62ed-7175-456e-953b-551c3148ba91-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-wr58b\" (UID: \"bf2a62ed-7175-456e-953b-551c3148ba91\") " pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.584334 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgnjl\" (UniqueName: \"kubernetes.io/projected/bf2a62ed-7175-456e-953b-551c3148ba91-kube-api-access-qgnjl\") pod \"ssh-known-hosts-edpm-deployment-wr58b\" (UID: \"bf2a62ed-7175-456e-953b-551c3148ba91\") " pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.588424 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf2a62ed-7175-456e-953b-551c3148ba91-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-wr58b\" (UID: \"bf2a62ed-7175-456e-953b-551c3148ba91\") " pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 
20:44:05.588896 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bf2a62ed-7175-456e-953b-551c3148ba91-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-wr58b\" (UID: \"bf2a62ed-7175-456e-953b-551c3148ba91\") " pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.599482 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgnjl\" (UniqueName: \"kubernetes.io/projected/bf2a62ed-7175-456e-953b-551c3148ba91-kube-api-access-qgnjl\") pod \"ssh-known-hosts-edpm-deployment-wr58b\" (UID: \"bf2a62ed-7175-456e-953b-551c3148ba91\") " pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" Oct 05 20:44:05 crc kubenswrapper[4753]: I1005 20:44:05.757907 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" Oct 05 20:44:06 crc kubenswrapper[4753]: I1005 20:44:06.039800 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-2cbwk"] Oct 05 20:44:06 crc kubenswrapper[4753]: I1005 20:44:06.049290 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-2cbwk"] Oct 05 20:44:06 crc kubenswrapper[4753]: I1005 20:44:06.354022 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-wr58b"] Oct 05 20:44:06 crc kubenswrapper[4753]: W1005 20:44:06.370661 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf2a62ed_7175_456e_953b_551c3148ba91.slice/crio-7c20dfc6037baba6dcba4111c919a9a5a1f873ebe09f6baccb65c27e8257bc1b WatchSource:0}: Error finding container 7c20dfc6037baba6dcba4111c919a9a5a1f873ebe09f6baccb65c27e8257bc1b: Status 404 returned error can't find the container with id 7c20dfc6037baba6dcba4111c919a9a5a1f873ebe09f6baccb65c27e8257bc1b Oct 05 
20:44:07 crc kubenswrapper[4753]: I1005 20:44:07.348957 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" event={"ID":"bf2a62ed-7175-456e-953b-551c3148ba91","Type":"ContainerStarted","Data":"6cb93bada9c1cf3185d356d6167b6c8ad40d5b60576815c72028d6dd9215d2fe"} Oct 05 20:44:07 crc kubenswrapper[4753]: I1005 20:44:07.349430 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" event={"ID":"bf2a62ed-7175-456e-953b-551c3148ba91","Type":"ContainerStarted","Data":"7c20dfc6037baba6dcba4111c919a9a5a1f873ebe09f6baccb65c27e8257bc1b"} Oct 05 20:44:07 crc kubenswrapper[4753]: I1005 20:44:07.374018 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" podStartSLOduration=1.948015922 podStartE2EDuration="2.373994867s" podCreationTimestamp="2025-10-05 20:44:05 +0000 UTC" firstStartedPulling="2025-10-05 20:44:06.374262271 +0000 UTC m=+1755.222590503" lastFinishedPulling="2025-10-05 20:44:06.800241216 +0000 UTC m=+1755.648569448" observedRunningTime="2025-10-05 20:44:07.370604531 +0000 UTC m=+1756.218932763" watchObservedRunningTime="2025-10-05 20:44:07.373994867 +0000 UTC m=+1756.222323109" Oct 05 20:44:07 crc kubenswrapper[4753]: I1005 20:44:07.864673 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d85233-42bd-4ee4-9d31-c0de142846b3" path="/var/lib/kubelet/pods/25d85233-42bd-4ee4-9d31-c0de142846b3/volumes" Oct 05 20:44:11 crc kubenswrapper[4753]: I1005 20:44:11.858756 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:44:11 crc kubenswrapper[4753]: E1005 20:44:11.859000 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:44:15 crc kubenswrapper[4753]: I1005 20:44:15.425970 4753 generic.go:334] "Generic (PLEG): container finished" podID="bf2a62ed-7175-456e-953b-551c3148ba91" containerID="6cb93bada9c1cf3185d356d6167b6c8ad40d5b60576815c72028d6dd9215d2fe" exitCode=0 Oct 05 20:44:15 crc kubenswrapper[4753]: I1005 20:44:15.426814 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" event={"ID":"bf2a62ed-7175-456e-953b-551c3148ba91","Type":"ContainerDied","Data":"6cb93bada9c1cf3185d356d6167b6c8ad40d5b60576815c72028d6dd9215d2fe"} Oct 05 20:44:16 crc kubenswrapper[4753]: I1005 20:44:16.836057 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" Oct 05 20:44:16 crc kubenswrapper[4753]: I1005 20:44:16.891708 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf2a62ed-7175-456e-953b-551c3148ba91-ssh-key-openstack-edpm-ipam\") pod \"bf2a62ed-7175-456e-953b-551c3148ba91\" (UID: \"bf2a62ed-7175-456e-953b-551c3148ba91\") " Oct 05 20:44:16 crc kubenswrapper[4753]: I1005 20:44:16.891956 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgnjl\" (UniqueName: \"kubernetes.io/projected/bf2a62ed-7175-456e-953b-551c3148ba91-kube-api-access-qgnjl\") pod \"bf2a62ed-7175-456e-953b-551c3148ba91\" (UID: \"bf2a62ed-7175-456e-953b-551c3148ba91\") " Oct 05 20:44:16 crc kubenswrapper[4753]: I1005 20:44:16.892024 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bf2a62ed-7175-456e-953b-551c3148ba91-inventory-0\") pod 
\"bf2a62ed-7175-456e-953b-551c3148ba91\" (UID: \"bf2a62ed-7175-456e-953b-551c3148ba91\") " Oct 05 20:44:16 crc kubenswrapper[4753]: I1005 20:44:16.899513 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf2a62ed-7175-456e-953b-551c3148ba91-kube-api-access-qgnjl" (OuterVolumeSpecName: "kube-api-access-qgnjl") pod "bf2a62ed-7175-456e-953b-551c3148ba91" (UID: "bf2a62ed-7175-456e-953b-551c3148ba91"). InnerVolumeSpecName "kube-api-access-qgnjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:44:16 crc kubenswrapper[4753]: I1005 20:44:16.916472 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf2a62ed-7175-456e-953b-551c3148ba91-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "bf2a62ed-7175-456e-953b-551c3148ba91" (UID: "bf2a62ed-7175-456e-953b-551c3148ba91"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:44:16 crc kubenswrapper[4753]: I1005 20:44:16.918551 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf2a62ed-7175-456e-953b-551c3148ba91-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bf2a62ed-7175-456e-953b-551c3148ba91" (UID: "bf2a62ed-7175-456e-953b-551c3148ba91"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:44:16 crc kubenswrapper[4753]: I1005 20:44:16.994016 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgnjl\" (UniqueName: \"kubernetes.io/projected/bf2a62ed-7175-456e-953b-551c3148ba91-kube-api-access-qgnjl\") on node \"crc\" DevicePath \"\"" Oct 05 20:44:16 crc kubenswrapper[4753]: I1005 20:44:16.994052 4753 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bf2a62ed-7175-456e-953b-551c3148ba91-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 05 20:44:16 crc kubenswrapper[4753]: I1005 20:44:16.994063 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf2a62ed-7175-456e-953b-551c3148ba91-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.493437 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" event={"ID":"bf2a62ed-7175-456e-953b-551c3148ba91","Type":"ContainerDied","Data":"7c20dfc6037baba6dcba4111c919a9a5a1f873ebe09f6baccb65c27e8257bc1b"} Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.493709 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c20dfc6037baba6dcba4111c919a9a5a1f873ebe09f6baccb65c27e8257bc1b" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.493842 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-wr58b" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.533589 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh"] Oct 05 20:44:17 crc kubenswrapper[4753]: E1005 20:44:17.534365 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2a62ed-7175-456e-953b-551c3148ba91" containerName="ssh-known-hosts-edpm-deployment" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.534493 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2a62ed-7175-456e-953b-551c3148ba91" containerName="ssh-known-hosts-edpm-deployment" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.535032 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf2a62ed-7175-456e-953b-551c3148ba91" containerName="ssh-known-hosts-edpm-deployment" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.535870 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.539040 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.539100 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.539609 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.542608 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.549008 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh"] Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.602903 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14e64073-f43f-4db5-9c64-78f928a2e022-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dtwmh\" (UID: \"14e64073-f43f-4db5-9c64-78f928a2e022\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.602958 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhwjg\" (UniqueName: \"kubernetes.io/projected/14e64073-f43f-4db5-9c64-78f928a2e022-kube-api-access-fhwjg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dtwmh\" (UID: \"14e64073-f43f-4db5-9c64-78f928a2e022\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.602995 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14e64073-f43f-4db5-9c64-78f928a2e022-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dtwmh\" (UID: \"14e64073-f43f-4db5-9c64-78f928a2e022\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.704794 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14e64073-f43f-4db5-9c64-78f928a2e022-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dtwmh\" (UID: \"14e64073-f43f-4db5-9c64-78f928a2e022\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.704987 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14e64073-f43f-4db5-9c64-78f928a2e022-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dtwmh\" (UID: \"14e64073-f43f-4db5-9c64-78f928a2e022\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.705044 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhwjg\" (UniqueName: \"kubernetes.io/projected/14e64073-f43f-4db5-9c64-78f928a2e022-kube-api-access-fhwjg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dtwmh\" (UID: \"14e64073-f43f-4db5-9c64-78f928a2e022\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.709784 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14e64073-f43f-4db5-9c64-78f928a2e022-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dtwmh\" (UID: \"14e64073-f43f-4db5-9c64-78f928a2e022\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.715592 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14e64073-f43f-4db5-9c64-78f928a2e022-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dtwmh\" (UID: \"14e64073-f43f-4db5-9c64-78f928a2e022\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.722601 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhwjg\" (UniqueName: \"kubernetes.io/projected/14e64073-f43f-4db5-9c64-78f928a2e022-kube-api-access-fhwjg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dtwmh\" (UID: \"14e64073-f43f-4db5-9c64-78f928a2e022\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" Oct 05 20:44:17 crc kubenswrapper[4753]: I1005 20:44:17.861010 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" Oct 05 20:44:18 crc kubenswrapper[4753]: I1005 20:44:18.252404 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh"] Oct 05 20:44:18 crc kubenswrapper[4753]: I1005 20:44:18.500621 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" event={"ID":"14e64073-f43f-4db5-9c64-78f928a2e022","Type":"ContainerStarted","Data":"6d7ddfeac19fb746fbddff99a812f554ed8ef692949903a49a71fcdfe4e3d288"} Oct 05 20:44:19 crc kubenswrapper[4753]: I1005 20:44:19.513328 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" event={"ID":"14e64073-f43f-4db5-9c64-78f928a2e022","Type":"ContainerStarted","Data":"f8b58365d310f3ee39931e5fdc515716c983721f6a42c06d4a70680276cbc356"} Oct 05 20:44:19 crc kubenswrapper[4753]: I1005 20:44:19.572110 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" podStartSLOduration=2.125132165 podStartE2EDuration="2.572081969s" podCreationTimestamp="2025-10-05 20:44:17 +0000 UTC" firstStartedPulling="2025-10-05 20:44:18.263315261 +0000 UTC m=+1767.111643493" lastFinishedPulling="2025-10-05 20:44:18.710265055 +0000 UTC m=+1767.558593297" observedRunningTime="2025-10-05 20:44:19.551924993 +0000 UTC m=+1768.400253265" watchObservedRunningTime="2025-10-05 20:44:19.572081969 +0000 UTC m=+1768.420410211" Oct 05 20:44:25 crc kubenswrapper[4753]: I1005 20:44:25.852206 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:44:25 crc kubenswrapper[4753]: E1005 20:44:25.852937 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:44:28 crc kubenswrapper[4753]: I1005 20:44:28.616015 4753 generic.go:334] "Generic (PLEG): container finished" podID="14e64073-f43f-4db5-9c64-78f928a2e022" containerID="f8b58365d310f3ee39931e5fdc515716c983721f6a42c06d4a70680276cbc356" exitCode=0 Oct 05 20:44:28 crc kubenswrapper[4753]: I1005 20:44:28.616316 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" event={"ID":"14e64073-f43f-4db5-9c64-78f928a2e022","Type":"ContainerDied","Data":"f8b58365d310f3ee39931e5fdc515716c983721f6a42c06d4a70680276cbc356"} Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.093478 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.152941 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14e64073-f43f-4db5-9c64-78f928a2e022-ssh-key\") pod \"14e64073-f43f-4db5-9c64-78f928a2e022\" (UID: \"14e64073-f43f-4db5-9c64-78f928a2e022\") " Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.153018 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhwjg\" (UniqueName: \"kubernetes.io/projected/14e64073-f43f-4db5-9c64-78f928a2e022-kube-api-access-fhwjg\") pod \"14e64073-f43f-4db5-9c64-78f928a2e022\" (UID: \"14e64073-f43f-4db5-9c64-78f928a2e022\") " Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.153058 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/14e64073-f43f-4db5-9c64-78f928a2e022-inventory\") pod \"14e64073-f43f-4db5-9c64-78f928a2e022\" (UID: \"14e64073-f43f-4db5-9c64-78f928a2e022\") " Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.159868 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14e64073-f43f-4db5-9c64-78f928a2e022-kube-api-access-fhwjg" (OuterVolumeSpecName: "kube-api-access-fhwjg") pod "14e64073-f43f-4db5-9c64-78f928a2e022" (UID: "14e64073-f43f-4db5-9c64-78f928a2e022"). InnerVolumeSpecName "kube-api-access-fhwjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.178645 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e64073-f43f-4db5-9c64-78f928a2e022-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "14e64073-f43f-4db5-9c64-78f928a2e022" (UID: "14e64073-f43f-4db5-9c64-78f928a2e022"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.187615 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e64073-f43f-4db5-9c64-78f928a2e022-inventory" (OuterVolumeSpecName: "inventory") pod "14e64073-f43f-4db5-9c64-78f928a2e022" (UID: "14e64073-f43f-4db5-9c64-78f928a2e022"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.255516 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14e64073-f43f-4db5-9c64-78f928a2e022-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.255559 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhwjg\" (UniqueName: \"kubernetes.io/projected/14e64073-f43f-4db5-9c64-78f928a2e022-kube-api-access-fhwjg\") on node \"crc\" DevicePath \"\"" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.255577 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14e64073-f43f-4db5-9c64-78f928a2e022-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.639114 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" event={"ID":"14e64073-f43f-4db5-9c64-78f928a2e022","Type":"ContainerDied","Data":"6d7ddfeac19fb746fbddff99a812f554ed8ef692949903a49a71fcdfe4e3d288"} Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.639188 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d7ddfeac19fb746fbddff99a812f554ed8ef692949903a49a71fcdfe4e3d288" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.639222 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.719730 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5"] Oct 05 20:44:30 crc kubenswrapper[4753]: E1005 20:44:30.720136 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14e64073-f43f-4db5-9c64-78f928a2e022" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.720176 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="14e64073-f43f-4db5-9c64-78f928a2e022" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.720399 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="14e64073-f43f-4db5-9c64-78f928a2e022" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.721102 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.722704 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.725781 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.725789 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.726118 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.743571 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5"] Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.765705 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d65339d-58d4-4124-9e26-1997be504d6a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5\" (UID: \"7d65339d-58d4-4124-9e26-1997be504d6a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.766198 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff27q\" (UniqueName: \"kubernetes.io/projected/7d65339d-58d4-4124-9e26-1997be504d6a-kube-api-access-ff27q\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5\" (UID: \"7d65339d-58d4-4124-9e26-1997be504d6a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.766295 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d65339d-58d4-4124-9e26-1997be504d6a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5\" (UID: \"7d65339d-58d4-4124-9e26-1997be504d6a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.868130 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d65339d-58d4-4124-9e26-1997be504d6a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5\" (UID: \"7d65339d-58d4-4124-9e26-1997be504d6a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.868467 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d65339d-58d4-4124-9e26-1997be504d6a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5\" (UID: \"7d65339d-58d4-4124-9e26-1997be504d6a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.868632 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff27q\" (UniqueName: \"kubernetes.io/projected/7d65339d-58d4-4124-9e26-1997be504d6a-kube-api-access-ff27q\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5\" (UID: \"7d65339d-58d4-4124-9e26-1997be504d6a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.872950 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d65339d-58d4-4124-9e26-1997be504d6a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5\" (UID: 
\"7d65339d-58d4-4124-9e26-1997be504d6a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.875794 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d65339d-58d4-4124-9e26-1997be504d6a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5\" (UID: \"7d65339d-58d4-4124-9e26-1997be504d6a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" Oct 05 20:44:30 crc kubenswrapper[4753]: I1005 20:44:30.890604 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff27q\" (UniqueName: \"kubernetes.io/projected/7d65339d-58d4-4124-9e26-1997be504d6a-kube-api-access-ff27q\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5\" (UID: \"7d65339d-58d4-4124-9e26-1997be504d6a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" Oct 05 20:44:31 crc kubenswrapper[4753]: I1005 20:44:31.048499 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" Oct 05 20:44:31 crc kubenswrapper[4753]: I1005 20:44:31.608596 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5"] Oct 05 20:44:31 crc kubenswrapper[4753]: I1005 20:44:31.648956 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" event={"ID":"7d65339d-58d4-4124-9e26-1997be504d6a","Type":"ContainerStarted","Data":"909c4ebd049de09f04f91a136142a069a779f064b6d6838207e19fae0a63ee34"} Oct 05 20:44:32 crc kubenswrapper[4753]: I1005 20:44:32.662121 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" event={"ID":"7d65339d-58d4-4124-9e26-1997be504d6a","Type":"ContainerStarted","Data":"f20abac237a6164b7dd0ca42d13ecda28b61969f1b84cf636c833d7b780bc91a"} Oct 05 20:44:32 crc kubenswrapper[4753]: I1005 20:44:32.701309 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" podStartSLOduration=2.30646154 podStartE2EDuration="2.701276496s" podCreationTimestamp="2025-10-05 20:44:30 +0000 UTC" firstStartedPulling="2025-10-05 20:44:31.627385089 +0000 UTC m=+1780.475713331" lastFinishedPulling="2025-10-05 20:44:32.022200055 +0000 UTC m=+1780.870528287" observedRunningTime="2025-10-05 20:44:32.687032794 +0000 UTC m=+1781.535361066" watchObservedRunningTime="2025-10-05 20:44:32.701276496 +0000 UTC m=+1781.549604758" Oct 05 20:44:39 crc kubenswrapper[4753]: I1005 20:44:39.853465 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:44:39 crc kubenswrapper[4753]: E1005 20:44:39.854566 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:44:42 crc kubenswrapper[4753]: I1005 20:44:42.783796 4753 generic.go:334] "Generic (PLEG): container finished" podID="7d65339d-58d4-4124-9e26-1997be504d6a" containerID="f20abac237a6164b7dd0ca42d13ecda28b61969f1b84cf636c833d7b780bc91a" exitCode=0 Oct 05 20:44:42 crc kubenswrapper[4753]: I1005 20:44:42.783905 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" event={"ID":"7d65339d-58d4-4124-9e26-1997be504d6a","Type":"ContainerDied","Data":"f20abac237a6164b7dd0ca42d13ecda28b61969f1b84cf636c833d7b780bc91a"} Oct 05 20:44:44 crc kubenswrapper[4753]: I1005 20:44:44.217934 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" Oct 05 20:44:44 crc kubenswrapper[4753]: I1005 20:44:44.344962 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff27q\" (UniqueName: \"kubernetes.io/projected/7d65339d-58d4-4124-9e26-1997be504d6a-kube-api-access-ff27q\") pod \"7d65339d-58d4-4124-9e26-1997be504d6a\" (UID: \"7d65339d-58d4-4124-9e26-1997be504d6a\") " Oct 05 20:44:44 crc kubenswrapper[4753]: I1005 20:44:44.345157 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d65339d-58d4-4124-9e26-1997be504d6a-ssh-key\") pod \"7d65339d-58d4-4124-9e26-1997be504d6a\" (UID: \"7d65339d-58d4-4124-9e26-1997be504d6a\") " Oct 05 20:44:44 crc kubenswrapper[4753]: I1005 20:44:44.345216 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/7d65339d-58d4-4124-9e26-1997be504d6a-inventory\") pod \"7d65339d-58d4-4124-9e26-1997be504d6a\" (UID: \"7d65339d-58d4-4124-9e26-1997be504d6a\") " Oct 05 20:44:44 crc kubenswrapper[4753]: I1005 20:44:44.350908 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d65339d-58d4-4124-9e26-1997be504d6a-kube-api-access-ff27q" (OuterVolumeSpecName: "kube-api-access-ff27q") pod "7d65339d-58d4-4124-9e26-1997be504d6a" (UID: "7d65339d-58d4-4124-9e26-1997be504d6a"). InnerVolumeSpecName "kube-api-access-ff27q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:44:44 crc kubenswrapper[4753]: I1005 20:44:44.371796 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d65339d-58d4-4124-9e26-1997be504d6a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7d65339d-58d4-4124-9e26-1997be504d6a" (UID: "7d65339d-58d4-4124-9e26-1997be504d6a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:44:44 crc kubenswrapper[4753]: I1005 20:44:44.373804 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d65339d-58d4-4124-9e26-1997be504d6a-inventory" (OuterVolumeSpecName: "inventory") pod "7d65339d-58d4-4124-9e26-1997be504d6a" (UID: "7d65339d-58d4-4124-9e26-1997be504d6a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:44:44 crc kubenswrapper[4753]: I1005 20:44:44.447843 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff27q\" (UniqueName: \"kubernetes.io/projected/7d65339d-58d4-4124-9e26-1997be504d6a-kube-api-access-ff27q\") on node \"crc\" DevicePath \"\"" Oct 05 20:44:44 crc kubenswrapper[4753]: I1005 20:44:44.447873 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d65339d-58d4-4124-9e26-1997be504d6a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:44:44 crc kubenswrapper[4753]: I1005 20:44:44.447885 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d65339d-58d4-4124-9e26-1997be504d6a-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:44:44 crc kubenswrapper[4753]: I1005 20:44:44.808545 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" event={"ID":"7d65339d-58d4-4124-9e26-1997be504d6a","Type":"ContainerDied","Data":"909c4ebd049de09f04f91a136142a069a779f064b6d6838207e19fae0a63ee34"} Oct 05 20:44:44 crc kubenswrapper[4753]: I1005 20:44:44.808608 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="909c4ebd049de09f04f91a136142a069a779f064b6d6838207e19fae0a63ee34" Oct 05 20:44:44 crc kubenswrapper[4753]: I1005 20:44:44.808896 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5" Oct 05 20:44:52 crc kubenswrapper[4753]: I1005 20:44:52.852659 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:44:52 crc kubenswrapper[4753]: E1005 20:44:52.853369 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.187125 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46"] Oct 05 20:45:00 crc kubenswrapper[4753]: E1005 20:45:00.188115 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d65339d-58d4-4124-9e26-1997be504d6a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.188134 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d65339d-58d4-4124-9e26-1997be504d6a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.188358 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d65339d-58d4-4124-9e26-1997be504d6a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.189019 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.191127 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.191441 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.203207 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46"] Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.343436 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr4lm\" (UniqueName: \"kubernetes.io/projected/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-kube-api-access-kr4lm\") pod \"collect-profiles-29328285-56g46\" (UID: \"2f0158d5-f700-4d7c-a6e9-54f55bfc830c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.343599 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-secret-volume\") pod \"collect-profiles-29328285-56g46\" (UID: \"2f0158d5-f700-4d7c-a6e9-54f55bfc830c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.343733 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-config-volume\") pod \"collect-profiles-29328285-56g46\" (UID: \"2f0158d5-f700-4d7c-a6e9-54f55bfc830c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.445833 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr4lm\" (UniqueName: \"kubernetes.io/projected/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-kube-api-access-kr4lm\") pod \"collect-profiles-29328285-56g46\" (UID: \"2f0158d5-f700-4d7c-a6e9-54f55bfc830c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.446174 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-secret-volume\") pod \"collect-profiles-29328285-56g46\" (UID: \"2f0158d5-f700-4d7c-a6e9-54f55bfc830c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.446234 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-config-volume\") pod \"collect-profiles-29328285-56g46\" (UID: \"2f0158d5-f700-4d7c-a6e9-54f55bfc830c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.447253 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-config-volume\") pod \"collect-profiles-29328285-56g46\" (UID: \"2f0158d5-f700-4d7c-a6e9-54f55bfc830c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.452049 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-secret-volume\") pod \"collect-profiles-29328285-56g46\" (UID: \"2f0158d5-f700-4d7c-a6e9-54f55bfc830c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.469692 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr4lm\" (UniqueName: \"kubernetes.io/projected/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-kube-api-access-kr4lm\") pod \"collect-profiles-29328285-56g46\" (UID: \"2f0158d5-f700-4d7c-a6e9-54f55bfc830c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.511683 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" Oct 05 20:45:00 crc kubenswrapper[4753]: I1005 20:45:00.957560 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46"] Oct 05 20:45:01 crc kubenswrapper[4753]: I1005 20:45:01.974272 4753 generic.go:334] "Generic (PLEG): container finished" podID="2f0158d5-f700-4d7c-a6e9-54f55bfc830c" containerID="48acf24bf446a951dd685758b111a6f4ae68ab851aea2e81b075aebb20a3ea6e" exitCode=0 Oct 05 20:45:01 crc kubenswrapper[4753]: I1005 20:45:01.974308 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" event={"ID":"2f0158d5-f700-4d7c-a6e9-54f55bfc830c","Type":"ContainerDied","Data":"48acf24bf446a951dd685758b111a6f4ae68ab851aea2e81b075aebb20a3ea6e"} Oct 05 20:45:01 crc kubenswrapper[4753]: I1005 20:45:01.974563 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" 
event={"ID":"2f0158d5-f700-4d7c-a6e9-54f55bfc830c","Type":"ContainerStarted","Data":"a43fd57137f9c6fee6b996452f68a483c4ecbd80fa3ba8e699b09a7db23b5e59"} Oct 05 20:45:03 crc kubenswrapper[4753]: I1005 20:45:03.281288 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" Oct 05 20:45:03 crc kubenswrapper[4753]: I1005 20:45:03.404690 4753 scope.go:117] "RemoveContainer" containerID="12598f685c16233aa70f1178fe26d38534b0986da8a8be2ec58412701847578b" Oct 05 20:45:03 crc kubenswrapper[4753]: I1005 20:45:03.412118 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-config-volume\") pod \"2f0158d5-f700-4d7c-a6e9-54f55bfc830c\" (UID: \"2f0158d5-f700-4d7c-a6e9-54f55bfc830c\") " Oct 05 20:45:03 crc kubenswrapper[4753]: I1005 20:45:03.412594 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-secret-volume\") pod \"2f0158d5-f700-4d7c-a6e9-54f55bfc830c\" (UID: \"2f0158d5-f700-4d7c-a6e9-54f55bfc830c\") " Oct 05 20:45:03 crc kubenswrapper[4753]: I1005 20:45:03.412651 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr4lm\" (UniqueName: \"kubernetes.io/projected/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-kube-api-access-kr4lm\") pod \"2f0158d5-f700-4d7c-a6e9-54f55bfc830c\" (UID: \"2f0158d5-f700-4d7c-a6e9-54f55bfc830c\") " Oct 05 20:45:03 crc kubenswrapper[4753]: I1005 20:45:03.412783 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-config-volume" (OuterVolumeSpecName: "config-volume") pod "2f0158d5-f700-4d7c-a6e9-54f55bfc830c" (UID: "2f0158d5-f700-4d7c-a6e9-54f55bfc830c"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:45:03 crc kubenswrapper[4753]: I1005 20:45:03.413569 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 05 20:45:03 crc kubenswrapper[4753]: I1005 20:45:03.419525 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-kube-api-access-kr4lm" (OuterVolumeSpecName: "kube-api-access-kr4lm") pod "2f0158d5-f700-4d7c-a6e9-54f55bfc830c" (UID: "2f0158d5-f700-4d7c-a6e9-54f55bfc830c"). InnerVolumeSpecName "kube-api-access-kr4lm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:45:03 crc kubenswrapper[4753]: I1005 20:45:03.427409 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2f0158d5-f700-4d7c-a6e9-54f55bfc830c" (UID: "2f0158d5-f700-4d7c-a6e9-54f55bfc830c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:45:03 crc kubenswrapper[4753]: I1005 20:45:03.515769 4753 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 05 20:45:03 crc kubenswrapper[4753]: I1005 20:45:03.515808 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr4lm\" (UniqueName: \"kubernetes.io/projected/2f0158d5-f700-4d7c-a6e9-54f55bfc830c-kube-api-access-kr4lm\") on node \"crc\" DevicePath \"\"" Oct 05 20:45:03 crc kubenswrapper[4753]: I1005 20:45:03.851871 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:45:03 crc kubenswrapper[4753]: E1005 20:45:03.852341 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:45:03 crc kubenswrapper[4753]: I1005 20:45:03.990028 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" event={"ID":"2f0158d5-f700-4d7c-a6e9-54f55bfc830c","Type":"ContainerDied","Data":"a43fd57137f9c6fee6b996452f68a483c4ecbd80fa3ba8e699b09a7db23b5e59"} Oct 05 20:45:03 crc kubenswrapper[4753]: I1005 20:45:03.990115 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a43fd57137f9c6fee6b996452f68a483c4ecbd80fa3ba8e699b09a7db23b5e59" Oct 05 20:45:03 crc kubenswrapper[4753]: I1005 20:45:03.990090 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46" Oct 05 20:45:17 crc kubenswrapper[4753]: I1005 20:45:17.852597 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:45:17 crc kubenswrapper[4753]: E1005 20:45:17.853378 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:45:28 crc kubenswrapper[4753]: I1005 20:45:28.852333 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:45:28 crc kubenswrapper[4753]: E1005 20:45:28.853059 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:45:43 crc kubenswrapper[4753]: I1005 20:45:43.853797 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:45:43 crc kubenswrapper[4753]: E1005 20:45:43.855282 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:45:57 crc kubenswrapper[4753]: I1005 20:45:57.852536 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:45:57 crc kubenswrapper[4753]: E1005 20:45:57.853260 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:46:08 crc kubenswrapper[4753]: I1005 20:46:08.852300 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:46:09 crc kubenswrapper[4753]: I1005 20:46:09.590532 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"5952695b2bbac0c6b13de72d6aaeea0b402956b255b3d9a665facde6fb89aed3"} Oct 05 20:48:34 crc kubenswrapper[4753]: I1005 20:48:34.489910 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:48:34 crc kubenswrapper[4753]: I1005 20:48:34.490530 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:48:41 crc kubenswrapper[4753]: I1005 20:48:41.891916 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5ctm8"] Oct 05 20:48:41 crc kubenswrapper[4753]: E1005 20:48:41.892931 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f0158d5-f700-4d7c-a6e9-54f55bfc830c" containerName="collect-profiles" Oct 05 20:48:41 crc kubenswrapper[4753]: I1005 20:48:41.892949 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f0158d5-f700-4d7c-a6e9-54f55bfc830c" containerName="collect-profiles" Oct 05 20:48:41 crc kubenswrapper[4753]: I1005 20:48:41.893395 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f0158d5-f700-4d7c-a6e9-54f55bfc830c" containerName="collect-profiles" Oct 05 20:48:41 crc kubenswrapper[4753]: I1005 20:48:41.894699 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:41 crc kubenswrapper[4753]: I1005 20:48:41.911168 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5ctm8"] Oct 05 20:48:42 crc kubenswrapper[4753]: I1005 20:48:42.056536 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55b5ed6b-8f8c-46a4-8377-04d056195633-catalog-content\") pod \"redhat-marketplace-5ctm8\" (UID: \"55b5ed6b-8f8c-46a4-8377-04d056195633\") " pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:42 crc kubenswrapper[4753]: I1005 20:48:42.056577 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mrbr\" (UniqueName: \"kubernetes.io/projected/55b5ed6b-8f8c-46a4-8377-04d056195633-kube-api-access-6mrbr\") pod \"redhat-marketplace-5ctm8\" (UID: \"55b5ed6b-8f8c-46a4-8377-04d056195633\") " pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:42 crc kubenswrapper[4753]: I1005 20:48:42.056611 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55b5ed6b-8f8c-46a4-8377-04d056195633-utilities\") pod \"redhat-marketplace-5ctm8\" (UID: \"55b5ed6b-8f8c-46a4-8377-04d056195633\") " pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:42 crc kubenswrapper[4753]: I1005 20:48:42.158632 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55b5ed6b-8f8c-46a4-8377-04d056195633-catalog-content\") pod \"redhat-marketplace-5ctm8\" (UID: \"55b5ed6b-8f8c-46a4-8377-04d056195633\") " pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:42 crc kubenswrapper[4753]: I1005 20:48:42.158919 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6mrbr\" (UniqueName: \"kubernetes.io/projected/55b5ed6b-8f8c-46a4-8377-04d056195633-kube-api-access-6mrbr\") pod \"redhat-marketplace-5ctm8\" (UID: \"55b5ed6b-8f8c-46a4-8377-04d056195633\") " pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:42 crc kubenswrapper[4753]: I1005 20:48:42.159023 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55b5ed6b-8f8c-46a4-8377-04d056195633-utilities\") pod \"redhat-marketplace-5ctm8\" (UID: \"55b5ed6b-8f8c-46a4-8377-04d056195633\") " pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:42 crc kubenswrapper[4753]: I1005 20:48:42.159150 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55b5ed6b-8f8c-46a4-8377-04d056195633-catalog-content\") pod \"redhat-marketplace-5ctm8\" (UID: \"55b5ed6b-8f8c-46a4-8377-04d056195633\") " pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:42 crc kubenswrapper[4753]: I1005 20:48:42.159570 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55b5ed6b-8f8c-46a4-8377-04d056195633-utilities\") pod \"redhat-marketplace-5ctm8\" (UID: \"55b5ed6b-8f8c-46a4-8377-04d056195633\") " pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:42 crc kubenswrapper[4753]: I1005 20:48:42.179681 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mrbr\" (UniqueName: \"kubernetes.io/projected/55b5ed6b-8f8c-46a4-8377-04d056195633-kube-api-access-6mrbr\") pod \"redhat-marketplace-5ctm8\" (UID: \"55b5ed6b-8f8c-46a4-8377-04d056195633\") " pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:42 crc kubenswrapper[4753]: I1005 20:48:42.258974 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:42 crc kubenswrapper[4753]: I1005 20:48:42.673597 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5ctm8"] Oct 05 20:48:42 crc kubenswrapper[4753]: I1005 20:48:42.938661 4753 generic.go:334] "Generic (PLEG): container finished" podID="55b5ed6b-8f8c-46a4-8377-04d056195633" containerID="b3356e954990759e9650f5e8e8a7464de0b7e61307e52b6e2f7caf3f58988d06" exitCode=0 Oct 05 20:48:42 crc kubenswrapper[4753]: I1005 20:48:42.938999 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ctm8" event={"ID":"55b5ed6b-8f8c-46a4-8377-04d056195633","Type":"ContainerDied","Data":"b3356e954990759e9650f5e8e8a7464de0b7e61307e52b6e2f7caf3f58988d06"} Oct 05 20:48:42 crc kubenswrapper[4753]: I1005 20:48:42.939028 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ctm8" event={"ID":"55b5ed6b-8f8c-46a4-8377-04d056195633","Type":"ContainerStarted","Data":"31fce2a7e2033ce5c30269bbe5354cf6b0ca37f36597380fdafcf29edcdb14ea"} Oct 05 20:48:42 crc kubenswrapper[4753]: I1005 20:48:42.941859 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 05 20:48:43 crc kubenswrapper[4753]: I1005 20:48:43.950079 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ctm8" event={"ID":"55b5ed6b-8f8c-46a4-8377-04d056195633","Type":"ContainerStarted","Data":"3ea6ddd1a48c7dbffc028dee2d354026fff9bfe1f0c7e7b12ea3e7260bbb145a"} Oct 05 20:48:44 crc kubenswrapper[4753]: I1005 20:48:44.960063 4753 generic.go:334] "Generic (PLEG): container finished" podID="55b5ed6b-8f8c-46a4-8377-04d056195633" containerID="3ea6ddd1a48c7dbffc028dee2d354026fff9bfe1f0c7e7b12ea3e7260bbb145a" exitCode=0 Oct 05 20:48:44 crc kubenswrapper[4753]: I1005 20:48:44.960115 4753 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-5ctm8" event={"ID":"55b5ed6b-8f8c-46a4-8377-04d056195633","Type":"ContainerDied","Data":"3ea6ddd1a48c7dbffc028dee2d354026fff9bfe1f0c7e7b12ea3e7260bbb145a"} Oct 05 20:48:45 crc kubenswrapper[4753]: I1005 20:48:45.972764 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ctm8" event={"ID":"55b5ed6b-8f8c-46a4-8377-04d056195633","Type":"ContainerStarted","Data":"68c9e800b1f20793f5e8a898daa388a79db403ab5c6508cad56be5985d5eb1e7"} Oct 05 20:48:46 crc kubenswrapper[4753]: I1005 20:48:46.004746 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5ctm8" podStartSLOduration=2.47736134 podStartE2EDuration="5.004720468s" podCreationTimestamp="2025-10-05 20:48:41 +0000 UTC" firstStartedPulling="2025-10-05 20:48:42.940710581 +0000 UTC m=+2031.789038823" lastFinishedPulling="2025-10-05 20:48:45.468069709 +0000 UTC m=+2034.316397951" observedRunningTime="2025-10-05 20:48:45.998750103 +0000 UTC m=+2034.847078325" watchObservedRunningTime="2025-10-05 20:48:46.004720468 +0000 UTC m=+2034.853048710" Oct 05 20:48:52 crc kubenswrapper[4753]: I1005 20:48:52.259599 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:52 crc kubenswrapper[4753]: I1005 20:48:52.260708 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:52 crc kubenswrapper[4753]: I1005 20:48:52.322036 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:53 crc kubenswrapper[4753]: I1005 20:48:53.095835 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:53 crc kubenswrapper[4753]: I1005 20:48:53.141630 4753 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5ctm8"] Oct 05 20:48:55 crc kubenswrapper[4753]: I1005 20:48:55.073097 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5ctm8" podUID="55b5ed6b-8f8c-46a4-8377-04d056195633" containerName="registry-server" containerID="cri-o://68c9e800b1f20793f5e8a898daa388a79db403ab5c6508cad56be5985d5eb1e7" gracePeriod=2 Oct 05 20:48:55 crc kubenswrapper[4753]: I1005 20:48:55.523358 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:55 crc kubenswrapper[4753]: I1005 20:48:55.724678 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55b5ed6b-8f8c-46a4-8377-04d056195633-utilities\") pod \"55b5ed6b-8f8c-46a4-8377-04d056195633\" (UID: \"55b5ed6b-8f8c-46a4-8377-04d056195633\") " Oct 05 20:48:55 crc kubenswrapper[4753]: I1005 20:48:55.725118 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mrbr\" (UniqueName: \"kubernetes.io/projected/55b5ed6b-8f8c-46a4-8377-04d056195633-kube-api-access-6mrbr\") pod \"55b5ed6b-8f8c-46a4-8377-04d056195633\" (UID: \"55b5ed6b-8f8c-46a4-8377-04d056195633\") " Oct 05 20:48:55 crc kubenswrapper[4753]: I1005 20:48:55.725245 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55b5ed6b-8f8c-46a4-8377-04d056195633-catalog-content\") pod \"55b5ed6b-8f8c-46a4-8377-04d056195633\" (UID: \"55b5ed6b-8f8c-46a4-8377-04d056195633\") " Oct 05 20:48:55 crc kubenswrapper[4753]: I1005 20:48:55.725894 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55b5ed6b-8f8c-46a4-8377-04d056195633-utilities" (OuterVolumeSpecName: "utilities") pod 
"55b5ed6b-8f8c-46a4-8377-04d056195633" (UID: "55b5ed6b-8f8c-46a4-8377-04d056195633"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:48:55 crc kubenswrapper[4753]: I1005 20:48:55.736023 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55b5ed6b-8f8c-46a4-8377-04d056195633-kube-api-access-6mrbr" (OuterVolumeSpecName: "kube-api-access-6mrbr") pod "55b5ed6b-8f8c-46a4-8377-04d056195633" (UID: "55b5ed6b-8f8c-46a4-8377-04d056195633"). InnerVolumeSpecName "kube-api-access-6mrbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:48:55 crc kubenswrapper[4753]: I1005 20:48:55.741164 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55b5ed6b-8f8c-46a4-8377-04d056195633-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55b5ed6b-8f8c-46a4-8377-04d056195633" (UID: "55b5ed6b-8f8c-46a4-8377-04d056195633"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:48:55 crc kubenswrapper[4753]: I1005 20:48:55.827116 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55b5ed6b-8f8c-46a4-8377-04d056195633-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:48:55 crc kubenswrapper[4753]: I1005 20:48:55.827320 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mrbr\" (UniqueName: \"kubernetes.io/projected/55b5ed6b-8f8c-46a4-8377-04d056195633-kube-api-access-6mrbr\") on node \"crc\" DevicePath \"\"" Oct 05 20:48:55 crc kubenswrapper[4753]: I1005 20:48:55.827392 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55b5ed6b-8f8c-46a4-8377-04d056195633-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:48:56 crc kubenswrapper[4753]: I1005 20:48:56.089845 4753 generic.go:334] "Generic (PLEG): container finished" podID="55b5ed6b-8f8c-46a4-8377-04d056195633" containerID="68c9e800b1f20793f5e8a898daa388a79db403ab5c6508cad56be5985d5eb1e7" exitCode=0 Oct 05 20:48:56 crc kubenswrapper[4753]: I1005 20:48:56.089896 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ctm8" event={"ID":"55b5ed6b-8f8c-46a4-8377-04d056195633","Type":"ContainerDied","Data":"68c9e800b1f20793f5e8a898daa388a79db403ab5c6508cad56be5985d5eb1e7"} Oct 05 20:48:56 crc kubenswrapper[4753]: I1005 20:48:56.089928 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ctm8" event={"ID":"55b5ed6b-8f8c-46a4-8377-04d056195633","Type":"ContainerDied","Data":"31fce2a7e2033ce5c30269bbe5354cf6b0ca37f36597380fdafcf29edcdb14ea"} Oct 05 20:48:56 crc kubenswrapper[4753]: I1005 20:48:56.089946 4753 scope.go:117] "RemoveContainer" containerID="68c9e800b1f20793f5e8a898daa388a79db403ab5c6508cad56be5985d5eb1e7" Oct 05 20:48:56 crc kubenswrapper[4753]: I1005 
20:48:56.089967 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5ctm8" Oct 05 20:48:56 crc kubenswrapper[4753]: I1005 20:48:56.124913 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5ctm8"] Oct 05 20:48:56 crc kubenswrapper[4753]: I1005 20:48:56.131431 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5ctm8"] Oct 05 20:48:56 crc kubenswrapper[4753]: I1005 20:48:56.139644 4753 scope.go:117] "RemoveContainer" containerID="3ea6ddd1a48c7dbffc028dee2d354026fff9bfe1f0c7e7b12ea3e7260bbb145a" Oct 05 20:48:56 crc kubenswrapper[4753]: I1005 20:48:56.181693 4753 scope.go:117] "RemoveContainer" containerID="b3356e954990759e9650f5e8e8a7464de0b7e61307e52b6e2f7caf3f58988d06" Oct 05 20:48:56 crc kubenswrapper[4753]: I1005 20:48:56.229801 4753 scope.go:117] "RemoveContainer" containerID="68c9e800b1f20793f5e8a898daa388a79db403ab5c6508cad56be5985d5eb1e7" Oct 05 20:48:56 crc kubenswrapper[4753]: E1005 20:48:56.230638 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68c9e800b1f20793f5e8a898daa388a79db403ab5c6508cad56be5985d5eb1e7\": container with ID starting with 68c9e800b1f20793f5e8a898daa388a79db403ab5c6508cad56be5985d5eb1e7 not found: ID does not exist" containerID="68c9e800b1f20793f5e8a898daa388a79db403ab5c6508cad56be5985d5eb1e7" Oct 05 20:48:56 crc kubenswrapper[4753]: I1005 20:48:56.230712 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c9e800b1f20793f5e8a898daa388a79db403ab5c6508cad56be5985d5eb1e7"} err="failed to get container status \"68c9e800b1f20793f5e8a898daa388a79db403ab5c6508cad56be5985d5eb1e7\": rpc error: code = NotFound desc = could not find container \"68c9e800b1f20793f5e8a898daa388a79db403ab5c6508cad56be5985d5eb1e7\": container with ID starting with 
68c9e800b1f20793f5e8a898daa388a79db403ab5c6508cad56be5985d5eb1e7 not found: ID does not exist" Oct 05 20:48:56 crc kubenswrapper[4753]: I1005 20:48:56.230764 4753 scope.go:117] "RemoveContainer" containerID="3ea6ddd1a48c7dbffc028dee2d354026fff9bfe1f0c7e7b12ea3e7260bbb145a" Oct 05 20:48:56 crc kubenswrapper[4753]: E1005 20:48:56.231372 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ea6ddd1a48c7dbffc028dee2d354026fff9bfe1f0c7e7b12ea3e7260bbb145a\": container with ID starting with 3ea6ddd1a48c7dbffc028dee2d354026fff9bfe1f0c7e7b12ea3e7260bbb145a not found: ID does not exist" containerID="3ea6ddd1a48c7dbffc028dee2d354026fff9bfe1f0c7e7b12ea3e7260bbb145a" Oct 05 20:48:56 crc kubenswrapper[4753]: I1005 20:48:56.231401 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ea6ddd1a48c7dbffc028dee2d354026fff9bfe1f0c7e7b12ea3e7260bbb145a"} err="failed to get container status \"3ea6ddd1a48c7dbffc028dee2d354026fff9bfe1f0c7e7b12ea3e7260bbb145a\": rpc error: code = NotFound desc = could not find container \"3ea6ddd1a48c7dbffc028dee2d354026fff9bfe1f0c7e7b12ea3e7260bbb145a\": container with ID starting with 3ea6ddd1a48c7dbffc028dee2d354026fff9bfe1f0c7e7b12ea3e7260bbb145a not found: ID does not exist" Oct 05 20:48:56 crc kubenswrapper[4753]: I1005 20:48:56.231419 4753 scope.go:117] "RemoveContainer" containerID="b3356e954990759e9650f5e8e8a7464de0b7e61307e52b6e2f7caf3f58988d06" Oct 05 20:48:56 crc kubenswrapper[4753]: E1005 20:48:56.232078 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3356e954990759e9650f5e8e8a7464de0b7e61307e52b6e2f7caf3f58988d06\": container with ID starting with b3356e954990759e9650f5e8e8a7464de0b7e61307e52b6e2f7caf3f58988d06 not found: ID does not exist" containerID="b3356e954990759e9650f5e8e8a7464de0b7e61307e52b6e2f7caf3f58988d06" Oct 05 20:48:56 crc 
kubenswrapper[4753]: I1005 20:48:56.232242 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3356e954990759e9650f5e8e8a7464de0b7e61307e52b6e2f7caf3f58988d06"} err="failed to get container status \"b3356e954990759e9650f5e8e8a7464de0b7e61307e52b6e2f7caf3f58988d06\": rpc error: code = NotFound desc = could not find container \"b3356e954990759e9650f5e8e8a7464de0b7e61307e52b6e2f7caf3f58988d06\": container with ID starting with b3356e954990759e9650f5e8e8a7464de0b7e61307e52b6e2f7caf3f58988d06 not found: ID does not exist" Oct 05 20:48:57 crc kubenswrapper[4753]: I1005 20:48:57.877097 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55b5ed6b-8f8c-46a4-8377-04d056195633" path="/var/lib/kubelet/pods/55b5ed6b-8f8c-46a4-8377-04d056195633/volumes" Oct 05 20:49:04 crc kubenswrapper[4753]: I1005 20:49:04.490330 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:49:04 crc kubenswrapper[4753]: I1005 20:49:04.491735 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:49:34 crc kubenswrapper[4753]: I1005 20:49:34.490251 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:49:34 crc kubenswrapper[4753]: I1005 20:49:34.490876 4753 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:49:34 crc kubenswrapper[4753]: I1005 20:49:34.490945 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:49:34 crc kubenswrapper[4753]: I1005 20:49:34.492015 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5952695b2bbac0c6b13de72d6aaeea0b402956b255b3d9a665facde6fb89aed3"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 20:49:34 crc kubenswrapper[4753]: I1005 20:49:34.492113 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" containerID="cri-o://5952695b2bbac0c6b13de72d6aaeea0b402956b255b3d9a665facde6fb89aed3" gracePeriod=600 Oct 05 20:49:35 crc kubenswrapper[4753]: I1005 20:49:35.437339 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="5952695b2bbac0c6b13de72d6aaeea0b402956b255b3d9a665facde6fb89aed3" exitCode=0 Oct 05 20:49:35 crc kubenswrapper[4753]: I1005 20:49:35.437391 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"5952695b2bbac0c6b13de72d6aaeea0b402956b255b3d9a665facde6fb89aed3"} Oct 05 20:49:35 crc kubenswrapper[4753]: I1005 
20:49:35.437889 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41"} Oct 05 20:49:35 crc kubenswrapper[4753]: I1005 20:49:35.437907 4753 scope.go:117] "RemoveContainer" containerID="f7f152e5f366a57669a015bab8b5e2ae33b1b003f0251c70c5ffbe0b9680e23a" Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.267344 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k2m22"] Oct 05 20:49:57 crc kubenswrapper[4753]: E1005 20:49:57.268235 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55b5ed6b-8f8c-46a4-8377-04d056195633" containerName="registry-server" Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.268247 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="55b5ed6b-8f8c-46a4-8377-04d056195633" containerName="registry-server" Oct 05 20:49:57 crc kubenswrapper[4753]: E1005 20:49:57.268276 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55b5ed6b-8f8c-46a4-8377-04d056195633" containerName="extract-content" Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.268282 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="55b5ed6b-8f8c-46a4-8377-04d056195633" containerName="extract-content" Oct 05 20:49:57 crc kubenswrapper[4753]: E1005 20:49:57.268297 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55b5ed6b-8f8c-46a4-8377-04d056195633" containerName="extract-utilities" Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.268304 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="55b5ed6b-8f8c-46a4-8377-04d056195633" containerName="extract-utilities" Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.268505 4753 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="55b5ed6b-8f8c-46a4-8377-04d056195633" containerName="registry-server" Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.269870 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.279059 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k2m22"] Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.414242 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbtcc\" (UniqueName: \"kubernetes.io/projected/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-kube-api-access-nbtcc\") pod \"redhat-operators-k2m22\" (UID: \"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc\") " pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.414289 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-catalog-content\") pod \"redhat-operators-k2m22\" (UID: \"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc\") " pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.414682 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-utilities\") pod \"redhat-operators-k2m22\" (UID: \"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc\") " pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.516525 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-utilities\") pod \"redhat-operators-k2m22\" (UID: \"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc\") " 
pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.516646 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbtcc\" (UniqueName: \"kubernetes.io/projected/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-kube-api-access-nbtcc\") pod \"redhat-operators-k2m22\" (UID: \"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc\") " pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.516687 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-catalog-content\") pod \"redhat-operators-k2m22\" (UID: \"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc\") " pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.517018 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-utilities\") pod \"redhat-operators-k2m22\" (UID: \"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc\") " pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.517208 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-catalog-content\") pod \"redhat-operators-k2m22\" (UID: \"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc\") " pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.534556 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbtcc\" (UniqueName: \"kubernetes.io/projected/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-kube-api-access-nbtcc\") pod \"redhat-operators-k2m22\" (UID: \"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc\") " pod="openshift-marketplace/redhat-operators-k2m22" Oct 
05 20:49:57 crc kubenswrapper[4753]: I1005 20:49:57.588343 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:49:58 crc kubenswrapper[4753]: I1005 20:49:58.068521 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k2m22"] Oct 05 20:49:58 crc kubenswrapper[4753]: I1005 20:49:58.629378 4753 generic.go:334] "Generic (PLEG): container finished" podID="ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" containerID="cc7f5f4bb57eada1618100ce1ac50bbeefd8da492b51b94e31b85fbdf71d6279" exitCode=0 Oct 05 20:49:58 crc kubenswrapper[4753]: I1005 20:49:58.629460 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2m22" event={"ID":"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc","Type":"ContainerDied","Data":"cc7f5f4bb57eada1618100ce1ac50bbeefd8da492b51b94e31b85fbdf71d6279"} Oct 05 20:49:58 crc kubenswrapper[4753]: I1005 20:49:58.630825 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2m22" event={"ID":"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc","Type":"ContainerStarted","Data":"e1c971a97fda8d9e29757ee7f0d636a99cd91bf49e019ac965ad5dc70bb22ae4"} Oct 05 20:50:00 crc kubenswrapper[4753]: I1005 20:50:00.661425 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2m22" event={"ID":"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc","Type":"ContainerStarted","Data":"7bd8cb9641a0a14a947b219a212bb7c93c4d89ebdd9508149493cc99d7ff6c5b"} Oct 05 20:50:02 crc kubenswrapper[4753]: E1005 20:50:02.833602 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffa2a0a3_d1b8_4ff6_b9f0_acd0d98566fc.slice/crio-conmon-7bd8cb9641a0a14a947b219a212bb7c93c4d89ebdd9508149493cc99d7ff6c5b.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffa2a0a3_d1b8_4ff6_b9f0_acd0d98566fc.slice/crio-7bd8cb9641a0a14a947b219a212bb7c93c4d89ebdd9508149493cc99d7ff6c5b.scope\": RecentStats: unable to find data in memory cache]" Oct 05 20:50:03 crc kubenswrapper[4753]: I1005 20:50:03.686866 4753 generic.go:334] "Generic (PLEG): container finished" podID="ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" containerID="7bd8cb9641a0a14a947b219a212bb7c93c4d89ebdd9508149493cc99d7ff6c5b" exitCode=0 Oct 05 20:50:03 crc kubenswrapper[4753]: I1005 20:50:03.686980 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2m22" event={"ID":"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc","Type":"ContainerDied","Data":"7bd8cb9641a0a14a947b219a212bb7c93c4d89ebdd9508149493cc99d7ff6c5b"} Oct 05 20:50:04 crc kubenswrapper[4753]: I1005 20:50:04.698579 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2m22" event={"ID":"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc","Type":"ContainerStarted","Data":"463911db5e5bf7b05b2776525a83b90dac8f4a714c32c13530e80be0b1761c36"} Oct 05 20:50:04 crc kubenswrapper[4753]: I1005 20:50:04.723668 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k2m22" podStartSLOduration=2.285770268 podStartE2EDuration="7.72364786s" podCreationTimestamp="2025-10-05 20:49:57 +0000 UTC" firstStartedPulling="2025-10-05 20:49:58.631413175 +0000 UTC m=+2107.479741407" lastFinishedPulling="2025-10-05 20:50:04.069290767 +0000 UTC m=+2112.917618999" observedRunningTime="2025-10-05 20:50:04.714524947 +0000 UTC m=+2113.562853199" watchObservedRunningTime="2025-10-05 20:50:04.72364786 +0000 UTC m=+2113.571976082" Oct 05 20:50:05 crc kubenswrapper[4753]: I1005 20:50:05.568752 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d5bf2"] Oct 05 20:50:05 crc kubenswrapper[4753]: I1005 20:50:05.570653 
4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:05 crc kubenswrapper[4753]: I1005 20:50:05.592700 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d5bf2"] Oct 05 20:50:05 crc kubenswrapper[4753]: I1005 20:50:05.678772 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/619a35fc-6a65-42cb-a0a0-9e2605111a42-utilities\") pod \"community-operators-d5bf2\" (UID: \"619a35fc-6a65-42cb-a0a0-9e2605111a42\") " pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:05 crc kubenswrapper[4753]: I1005 20:50:05.678848 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7n6c\" (UniqueName: \"kubernetes.io/projected/619a35fc-6a65-42cb-a0a0-9e2605111a42-kube-api-access-f7n6c\") pod \"community-operators-d5bf2\" (UID: \"619a35fc-6a65-42cb-a0a0-9e2605111a42\") " pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:05 crc kubenswrapper[4753]: I1005 20:50:05.679477 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/619a35fc-6a65-42cb-a0a0-9e2605111a42-catalog-content\") pod \"community-operators-d5bf2\" (UID: \"619a35fc-6a65-42cb-a0a0-9e2605111a42\") " pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:05 crc kubenswrapper[4753]: I1005 20:50:05.780988 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/619a35fc-6a65-42cb-a0a0-9e2605111a42-catalog-content\") pod \"community-operators-d5bf2\" (UID: \"619a35fc-6a65-42cb-a0a0-9e2605111a42\") " pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:05 crc kubenswrapper[4753]: I1005 
20:50:05.781044 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/619a35fc-6a65-42cb-a0a0-9e2605111a42-utilities\") pod \"community-operators-d5bf2\" (UID: \"619a35fc-6a65-42cb-a0a0-9e2605111a42\") " pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:05 crc kubenswrapper[4753]: I1005 20:50:05.781085 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7n6c\" (UniqueName: \"kubernetes.io/projected/619a35fc-6a65-42cb-a0a0-9e2605111a42-kube-api-access-f7n6c\") pod \"community-operators-d5bf2\" (UID: \"619a35fc-6a65-42cb-a0a0-9e2605111a42\") " pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:05 crc kubenswrapper[4753]: I1005 20:50:05.781942 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/619a35fc-6a65-42cb-a0a0-9e2605111a42-catalog-content\") pod \"community-operators-d5bf2\" (UID: \"619a35fc-6a65-42cb-a0a0-9e2605111a42\") " pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:05 crc kubenswrapper[4753]: I1005 20:50:05.782213 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/619a35fc-6a65-42cb-a0a0-9e2605111a42-utilities\") pod \"community-operators-d5bf2\" (UID: \"619a35fc-6a65-42cb-a0a0-9e2605111a42\") " pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:05 crc kubenswrapper[4753]: I1005 20:50:05.801001 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7n6c\" (UniqueName: \"kubernetes.io/projected/619a35fc-6a65-42cb-a0a0-9e2605111a42-kube-api-access-f7n6c\") pod \"community-operators-d5bf2\" (UID: \"619a35fc-6a65-42cb-a0a0-9e2605111a42\") " pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:05 crc kubenswrapper[4753]: I1005 20:50:05.893625 4753 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:06 crc kubenswrapper[4753]: I1005 20:50:06.482368 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d5bf2"] Oct 05 20:50:06 crc kubenswrapper[4753]: W1005 20:50:06.488640 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod619a35fc_6a65_42cb_a0a0_9e2605111a42.slice/crio-1eb38e3bd9a46aa92633564428feb179878d98f6c85689271134f64ef41a56be WatchSource:0}: Error finding container 1eb38e3bd9a46aa92633564428feb179878d98f6c85689271134f64ef41a56be: Status 404 returned error can't find the container with id 1eb38e3bd9a46aa92633564428feb179878d98f6c85689271134f64ef41a56be Oct 05 20:50:06 crc kubenswrapper[4753]: I1005 20:50:06.715581 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5bf2" event={"ID":"619a35fc-6a65-42cb-a0a0-9e2605111a42","Type":"ContainerStarted","Data":"cfbe92eb6bab3e254cbe045cc862ff4807ba0420dc59457619bca67b75345799"} Oct 05 20:50:06 crc kubenswrapper[4753]: I1005 20:50:06.715971 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5bf2" event={"ID":"619a35fc-6a65-42cb-a0a0-9e2605111a42","Type":"ContainerStarted","Data":"1eb38e3bd9a46aa92633564428feb179878d98f6c85689271134f64ef41a56be"} Oct 05 20:50:07 crc kubenswrapper[4753]: I1005 20:50:07.589407 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:50:07 crc kubenswrapper[4753]: I1005 20:50:07.589467 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:50:07 crc kubenswrapper[4753]: I1005 20:50:07.738071 4753 generic.go:334] "Generic (PLEG): container finished" 
podID="619a35fc-6a65-42cb-a0a0-9e2605111a42" containerID="cfbe92eb6bab3e254cbe045cc862ff4807ba0420dc59457619bca67b75345799" exitCode=0 Oct 05 20:50:07 crc kubenswrapper[4753]: I1005 20:50:07.738252 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5bf2" event={"ID":"619a35fc-6a65-42cb-a0a0-9e2605111a42","Type":"ContainerDied","Data":"cfbe92eb6bab3e254cbe045cc862ff4807ba0420dc59457619bca67b75345799"} Oct 05 20:50:08 crc kubenswrapper[4753]: I1005 20:50:08.641595 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k2m22" podUID="ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" containerName="registry-server" probeResult="failure" output=< Oct 05 20:50:08 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 20:50:08 crc kubenswrapper[4753]: > Oct 05 20:50:08 crc kubenswrapper[4753]: I1005 20:50:08.752084 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5bf2" event={"ID":"619a35fc-6a65-42cb-a0a0-9e2605111a42","Type":"ContainerStarted","Data":"b4236aee222114f829fbc46e5ca038c9c2b1df2d62f770ce46a467b0b51c9bbf"} Oct 05 20:50:10 crc kubenswrapper[4753]: I1005 20:50:10.770705 4753 generic.go:334] "Generic (PLEG): container finished" podID="619a35fc-6a65-42cb-a0a0-9e2605111a42" containerID="b4236aee222114f829fbc46e5ca038c9c2b1df2d62f770ce46a467b0b51c9bbf" exitCode=0 Oct 05 20:50:10 crc kubenswrapper[4753]: I1005 20:50:10.770880 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5bf2" event={"ID":"619a35fc-6a65-42cb-a0a0-9e2605111a42","Type":"ContainerDied","Data":"b4236aee222114f829fbc46e5ca038c9c2b1df2d62f770ce46a467b0b51c9bbf"} Oct 05 20:50:11 crc kubenswrapper[4753]: I1005 20:50:11.779912 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5bf2" 
event={"ID":"619a35fc-6a65-42cb-a0a0-9e2605111a42","Type":"ContainerStarted","Data":"4bbe31c49c04510c6975607786f4da519129dcaa4e67cf81fa9444b086851eec"} Oct 05 20:50:11 crc kubenswrapper[4753]: I1005 20:50:11.801516 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d5bf2" podStartSLOduration=3.36399808 podStartE2EDuration="6.801494061s" podCreationTimestamp="2025-10-05 20:50:05 +0000 UTC" firstStartedPulling="2025-10-05 20:50:07.740171764 +0000 UTC m=+2116.588499996" lastFinishedPulling="2025-10-05 20:50:11.177667745 +0000 UTC m=+2120.025995977" observedRunningTime="2025-10-05 20:50:11.796028222 +0000 UTC m=+2120.644356464" watchObservedRunningTime="2025-10-05 20:50:11.801494061 +0000 UTC m=+2120.649822293" Oct 05 20:50:15 crc kubenswrapper[4753]: I1005 20:50:15.894562 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:15 crc kubenswrapper[4753]: I1005 20:50:15.895664 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:16 crc kubenswrapper[4753]: I1005 20:50:16.942446 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-d5bf2" podUID="619a35fc-6a65-42cb-a0a0-9e2605111a42" containerName="registry-server" probeResult="failure" output=< Oct 05 20:50:16 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 20:50:16 crc kubenswrapper[4753]: > Oct 05 20:50:18 crc kubenswrapper[4753]: I1005 20:50:18.635585 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k2m22" podUID="ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" containerName="registry-server" probeResult="failure" output=< Oct 05 20:50:18 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 20:50:18 crc kubenswrapper[4753]: > 
Oct 05 20:50:25 crc kubenswrapper[4753]: I1005 20:50:25.949853 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:26 crc kubenswrapper[4753]: I1005 20:50:26.010471 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:26 crc kubenswrapper[4753]: I1005 20:50:26.201625 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d5bf2"] Oct 05 20:50:27 crc kubenswrapper[4753]: I1005 20:50:27.916460 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d5bf2" podUID="619a35fc-6a65-42cb-a0a0-9e2605111a42" containerName="registry-server" containerID="cri-o://4bbe31c49c04510c6975607786f4da519129dcaa4e67cf81fa9444b086851eec" gracePeriod=2 Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.426215 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.607036 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/619a35fc-6a65-42cb-a0a0-9e2605111a42-utilities\") pod \"619a35fc-6a65-42cb-a0a0-9e2605111a42\" (UID: \"619a35fc-6a65-42cb-a0a0-9e2605111a42\") " Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.607507 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7n6c\" (UniqueName: \"kubernetes.io/projected/619a35fc-6a65-42cb-a0a0-9e2605111a42-kube-api-access-f7n6c\") pod \"619a35fc-6a65-42cb-a0a0-9e2605111a42\" (UID: \"619a35fc-6a65-42cb-a0a0-9e2605111a42\") " Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.607544 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/619a35fc-6a65-42cb-a0a0-9e2605111a42-catalog-content\") pod \"619a35fc-6a65-42cb-a0a0-9e2605111a42\" (UID: \"619a35fc-6a65-42cb-a0a0-9e2605111a42\") " Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.612238 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/619a35fc-6a65-42cb-a0a0-9e2605111a42-utilities" (OuterVolumeSpecName: "utilities") pod "619a35fc-6a65-42cb-a0a0-9e2605111a42" (UID: "619a35fc-6a65-42cb-a0a0-9e2605111a42"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.623816 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619a35fc-6a65-42cb-a0a0-9e2605111a42-kube-api-access-f7n6c" (OuterVolumeSpecName: "kube-api-access-f7n6c") pod "619a35fc-6a65-42cb-a0a0-9e2605111a42" (UID: "619a35fc-6a65-42cb-a0a0-9e2605111a42"). InnerVolumeSpecName "kube-api-access-f7n6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.640683 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k2m22" podUID="ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" containerName="registry-server" probeResult="failure" output=< Oct 05 20:50:28 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 20:50:28 crc kubenswrapper[4753]: > Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.676792 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/619a35fc-6a65-42cb-a0a0-9e2605111a42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "619a35fc-6a65-42cb-a0a0-9e2605111a42" (UID: "619a35fc-6a65-42cb-a0a0-9e2605111a42"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.709326 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/619a35fc-6a65-42cb-a0a0-9e2605111a42-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.709363 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7n6c\" (UniqueName: \"kubernetes.io/projected/619a35fc-6a65-42cb-a0a0-9e2605111a42-kube-api-access-f7n6c\") on node \"crc\" DevicePath \"\"" Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.709378 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/619a35fc-6a65-42cb-a0a0-9e2605111a42-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.927465 4753 generic.go:334] "Generic (PLEG): container finished" podID="619a35fc-6a65-42cb-a0a0-9e2605111a42" containerID="4bbe31c49c04510c6975607786f4da519129dcaa4e67cf81fa9444b086851eec" exitCode=0 Oct 05 20:50:28 
crc kubenswrapper[4753]: I1005 20:50:28.927517 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5bf2" event={"ID":"619a35fc-6a65-42cb-a0a0-9e2605111a42","Type":"ContainerDied","Data":"4bbe31c49c04510c6975607786f4da519129dcaa4e67cf81fa9444b086851eec"} Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.927550 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5bf2" event={"ID":"619a35fc-6a65-42cb-a0a0-9e2605111a42","Type":"ContainerDied","Data":"1eb38e3bd9a46aa92633564428feb179878d98f6c85689271134f64ef41a56be"} Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.927569 4753 scope.go:117] "RemoveContainer" containerID="4bbe31c49c04510c6975607786f4da519129dcaa4e67cf81fa9444b086851eec" Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.927575 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d5bf2" Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.962960 4753 scope.go:117] "RemoveContainer" containerID="b4236aee222114f829fbc46e5ca038c9c2b1df2d62f770ce46a467b0b51c9bbf" Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.969821 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d5bf2"] Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.980812 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d5bf2"] Oct 05 20:50:28 crc kubenswrapper[4753]: I1005 20:50:28.996183 4753 scope.go:117] "RemoveContainer" containerID="cfbe92eb6bab3e254cbe045cc862ff4807ba0420dc59457619bca67b75345799" Oct 05 20:50:29 crc kubenswrapper[4753]: I1005 20:50:29.055408 4753 scope.go:117] "RemoveContainer" containerID="4bbe31c49c04510c6975607786f4da519129dcaa4e67cf81fa9444b086851eec" Oct 05 20:50:29 crc kubenswrapper[4753]: E1005 20:50:29.056214 4753 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"4bbe31c49c04510c6975607786f4da519129dcaa4e67cf81fa9444b086851eec\": container with ID starting with 4bbe31c49c04510c6975607786f4da519129dcaa4e67cf81fa9444b086851eec not found: ID does not exist" containerID="4bbe31c49c04510c6975607786f4da519129dcaa4e67cf81fa9444b086851eec" Oct 05 20:50:29 crc kubenswrapper[4753]: I1005 20:50:29.056281 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bbe31c49c04510c6975607786f4da519129dcaa4e67cf81fa9444b086851eec"} err="failed to get container status \"4bbe31c49c04510c6975607786f4da519129dcaa4e67cf81fa9444b086851eec\": rpc error: code = NotFound desc = could not find container \"4bbe31c49c04510c6975607786f4da519129dcaa4e67cf81fa9444b086851eec\": container with ID starting with 4bbe31c49c04510c6975607786f4da519129dcaa4e67cf81fa9444b086851eec not found: ID does not exist" Oct 05 20:50:29 crc kubenswrapper[4753]: I1005 20:50:29.056334 4753 scope.go:117] "RemoveContainer" containerID="b4236aee222114f829fbc46e5ca038c9c2b1df2d62f770ce46a467b0b51c9bbf" Oct 05 20:50:29 crc kubenswrapper[4753]: E1005 20:50:29.056987 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4236aee222114f829fbc46e5ca038c9c2b1df2d62f770ce46a467b0b51c9bbf\": container with ID starting with b4236aee222114f829fbc46e5ca038c9c2b1df2d62f770ce46a467b0b51c9bbf not found: ID does not exist" containerID="b4236aee222114f829fbc46e5ca038c9c2b1df2d62f770ce46a467b0b51c9bbf" Oct 05 20:50:29 crc kubenswrapper[4753]: I1005 20:50:29.057021 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4236aee222114f829fbc46e5ca038c9c2b1df2d62f770ce46a467b0b51c9bbf"} err="failed to get container status \"b4236aee222114f829fbc46e5ca038c9c2b1df2d62f770ce46a467b0b51c9bbf\": rpc error: code = NotFound desc = could not find container 
\"b4236aee222114f829fbc46e5ca038c9c2b1df2d62f770ce46a467b0b51c9bbf\": container with ID starting with b4236aee222114f829fbc46e5ca038c9c2b1df2d62f770ce46a467b0b51c9bbf not found: ID does not exist" Oct 05 20:50:29 crc kubenswrapper[4753]: I1005 20:50:29.057038 4753 scope.go:117] "RemoveContainer" containerID="cfbe92eb6bab3e254cbe045cc862ff4807ba0420dc59457619bca67b75345799" Oct 05 20:50:29 crc kubenswrapper[4753]: E1005 20:50:29.057524 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfbe92eb6bab3e254cbe045cc862ff4807ba0420dc59457619bca67b75345799\": container with ID starting with cfbe92eb6bab3e254cbe045cc862ff4807ba0420dc59457619bca67b75345799 not found: ID does not exist" containerID="cfbe92eb6bab3e254cbe045cc862ff4807ba0420dc59457619bca67b75345799" Oct 05 20:50:29 crc kubenswrapper[4753]: I1005 20:50:29.057574 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfbe92eb6bab3e254cbe045cc862ff4807ba0420dc59457619bca67b75345799"} err="failed to get container status \"cfbe92eb6bab3e254cbe045cc862ff4807ba0420dc59457619bca67b75345799\": rpc error: code = NotFound desc = could not find container \"cfbe92eb6bab3e254cbe045cc862ff4807ba0420dc59457619bca67b75345799\": container with ID starting with cfbe92eb6bab3e254cbe045cc862ff4807ba0420dc59457619bca67b75345799 not found: ID does not exist" Oct 05 20:50:29 crc kubenswrapper[4753]: I1005 20:50:29.871393 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="619a35fc-6a65-42cb-a0a0-9e2605111a42" path="/var/lib/kubelet/pods/619a35fc-6a65-42cb-a0a0-9e2605111a42/volumes" Oct 05 20:50:37 crc kubenswrapper[4753]: I1005 20:50:37.646641 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:50:37 crc kubenswrapper[4753]: I1005 20:50:37.705884 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:50:37 crc kubenswrapper[4753]: I1005 20:50:37.887960 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k2m22"] Oct 05 20:50:39 crc kubenswrapper[4753]: I1005 20:50:39.032759 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k2m22" podUID="ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" containerName="registry-server" containerID="cri-o://463911db5e5bf7b05b2776525a83b90dac8f4a714c32c13530e80be0b1761c36" gracePeriod=2 Oct 05 20:50:39 crc kubenswrapper[4753]: I1005 20:50:39.459455 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:50:39 crc kubenswrapper[4753]: I1005 20:50:39.610177 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-catalog-content\") pod \"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc\" (UID: \"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc\") " Oct 05 20:50:39 crc kubenswrapper[4753]: I1005 20:50:39.610324 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbtcc\" (UniqueName: \"kubernetes.io/projected/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-kube-api-access-nbtcc\") pod \"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc\" (UID: \"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc\") " Oct 05 20:50:39 crc kubenswrapper[4753]: I1005 20:50:39.610376 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-utilities\") pod \"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc\" (UID: \"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc\") " Oct 05 20:50:39 crc kubenswrapper[4753]: I1005 20:50:39.611388 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-utilities" (OuterVolumeSpecName: "utilities") pod "ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" (UID: "ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:50:39 crc kubenswrapper[4753]: I1005 20:50:39.616621 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-kube-api-access-nbtcc" (OuterVolumeSpecName: "kube-api-access-nbtcc") pod "ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" (UID: "ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc"). InnerVolumeSpecName "kube-api-access-nbtcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:50:39 crc kubenswrapper[4753]: I1005 20:50:39.698915 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" (UID: "ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:50:39 crc kubenswrapper[4753]: I1005 20:50:39.718529 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbtcc\" (UniqueName: \"kubernetes.io/projected/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-kube-api-access-nbtcc\") on node \"crc\" DevicePath \"\"" Oct 05 20:50:39 crc kubenswrapper[4753]: I1005 20:50:39.718569 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:50:39 crc kubenswrapper[4753]: I1005 20:50:39.718581 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:50:40 crc kubenswrapper[4753]: I1005 20:50:40.043466 4753 generic.go:334] "Generic (PLEG): container finished" podID="ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" containerID="463911db5e5bf7b05b2776525a83b90dac8f4a714c32c13530e80be0b1761c36" exitCode=0 Oct 05 20:50:40 crc kubenswrapper[4753]: I1005 20:50:40.043490 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k2m22" Oct 05 20:50:40 crc kubenswrapper[4753]: I1005 20:50:40.043504 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2m22" event={"ID":"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc","Type":"ContainerDied","Data":"463911db5e5bf7b05b2776525a83b90dac8f4a714c32c13530e80be0b1761c36"} Oct 05 20:50:40 crc kubenswrapper[4753]: I1005 20:50:40.043689 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2m22" event={"ID":"ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc","Type":"ContainerDied","Data":"e1c971a97fda8d9e29757ee7f0d636a99cd91bf49e019ac965ad5dc70bb22ae4"} Oct 05 20:50:40 crc kubenswrapper[4753]: I1005 20:50:40.043710 4753 scope.go:117] "RemoveContainer" containerID="463911db5e5bf7b05b2776525a83b90dac8f4a714c32c13530e80be0b1761c36" Oct 05 20:50:40 crc kubenswrapper[4753]: I1005 20:50:40.078067 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k2m22"] Oct 05 20:50:40 crc kubenswrapper[4753]: I1005 20:50:40.080402 4753 scope.go:117] "RemoveContainer" containerID="7bd8cb9641a0a14a947b219a212bb7c93c4d89ebdd9508149493cc99d7ff6c5b" Oct 05 20:50:40 crc kubenswrapper[4753]: I1005 20:50:40.090426 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k2m22"] Oct 05 20:50:40 crc kubenswrapper[4753]: I1005 20:50:40.110297 4753 scope.go:117] "RemoveContainer" containerID="cc7f5f4bb57eada1618100ce1ac50bbeefd8da492b51b94e31b85fbdf71d6279" Oct 05 20:50:40 crc kubenswrapper[4753]: I1005 20:50:40.144776 4753 scope.go:117] "RemoveContainer" containerID="463911db5e5bf7b05b2776525a83b90dac8f4a714c32c13530e80be0b1761c36" Oct 05 20:50:40 crc kubenswrapper[4753]: E1005 20:50:40.145351 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"463911db5e5bf7b05b2776525a83b90dac8f4a714c32c13530e80be0b1761c36\": container with ID starting with 463911db5e5bf7b05b2776525a83b90dac8f4a714c32c13530e80be0b1761c36 not found: ID does not exist" containerID="463911db5e5bf7b05b2776525a83b90dac8f4a714c32c13530e80be0b1761c36" Oct 05 20:50:40 crc kubenswrapper[4753]: I1005 20:50:40.145480 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"463911db5e5bf7b05b2776525a83b90dac8f4a714c32c13530e80be0b1761c36"} err="failed to get container status \"463911db5e5bf7b05b2776525a83b90dac8f4a714c32c13530e80be0b1761c36\": rpc error: code = NotFound desc = could not find container \"463911db5e5bf7b05b2776525a83b90dac8f4a714c32c13530e80be0b1761c36\": container with ID starting with 463911db5e5bf7b05b2776525a83b90dac8f4a714c32c13530e80be0b1761c36 not found: ID does not exist" Oct 05 20:50:40 crc kubenswrapper[4753]: I1005 20:50:40.145525 4753 scope.go:117] "RemoveContainer" containerID="7bd8cb9641a0a14a947b219a212bb7c93c4d89ebdd9508149493cc99d7ff6c5b" Oct 05 20:50:40 crc kubenswrapper[4753]: E1005 20:50:40.145894 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd8cb9641a0a14a947b219a212bb7c93c4d89ebdd9508149493cc99d7ff6c5b\": container with ID starting with 7bd8cb9641a0a14a947b219a212bb7c93c4d89ebdd9508149493cc99d7ff6c5b not found: ID does not exist" containerID="7bd8cb9641a0a14a947b219a212bb7c93c4d89ebdd9508149493cc99d7ff6c5b" Oct 05 20:50:40 crc kubenswrapper[4753]: I1005 20:50:40.145948 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd8cb9641a0a14a947b219a212bb7c93c4d89ebdd9508149493cc99d7ff6c5b"} err="failed to get container status \"7bd8cb9641a0a14a947b219a212bb7c93c4d89ebdd9508149493cc99d7ff6c5b\": rpc error: code = NotFound desc = could not find container \"7bd8cb9641a0a14a947b219a212bb7c93c4d89ebdd9508149493cc99d7ff6c5b\": container with ID 
starting with 7bd8cb9641a0a14a947b219a212bb7c93c4d89ebdd9508149493cc99d7ff6c5b not found: ID does not exist" Oct 05 20:50:40 crc kubenswrapper[4753]: I1005 20:50:40.145969 4753 scope.go:117] "RemoveContainer" containerID="cc7f5f4bb57eada1618100ce1ac50bbeefd8da492b51b94e31b85fbdf71d6279" Oct 05 20:50:40 crc kubenswrapper[4753]: E1005 20:50:40.146231 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc7f5f4bb57eada1618100ce1ac50bbeefd8da492b51b94e31b85fbdf71d6279\": container with ID starting with cc7f5f4bb57eada1618100ce1ac50bbeefd8da492b51b94e31b85fbdf71d6279 not found: ID does not exist" containerID="cc7f5f4bb57eada1618100ce1ac50bbeefd8da492b51b94e31b85fbdf71d6279" Oct 05 20:50:40 crc kubenswrapper[4753]: I1005 20:50:40.146253 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc7f5f4bb57eada1618100ce1ac50bbeefd8da492b51b94e31b85fbdf71d6279"} err="failed to get container status \"cc7f5f4bb57eada1618100ce1ac50bbeefd8da492b51b94e31b85fbdf71d6279\": rpc error: code = NotFound desc = could not find container \"cc7f5f4bb57eada1618100ce1ac50bbeefd8da492b51b94e31b85fbdf71d6279\": container with ID starting with cc7f5f4bb57eada1618100ce1ac50bbeefd8da492b51b94e31b85fbdf71d6279 not found: ID does not exist" Oct 05 20:50:41 crc kubenswrapper[4753]: I1005 20:50:41.876471 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" path="/var/lib/kubelet/pods/ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc/volumes" Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.496961 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-wr58b"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.510459 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4"] Oct 05 20:50:45 crc kubenswrapper[4753]: 
I1005 20:50:45.519072 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.526207 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.533372 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.553311 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.555439 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.565076 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.572131 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wtkj2"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.579105 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-dtlzn"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.584904 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.590027 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wqfx4"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.595317 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-stbtf"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.600424 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vf8ll"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.605701 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8b89"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.612339 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.619811 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.625720 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v7p24"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.630936 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-wr58b"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.635806 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgnl5"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.640581 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-br66s"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.645289 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-dtwmh"] Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.865368 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03feab36-54ee-4968-a9d9-841cbd059c48" path="/var/lib/kubelet/pods/03feab36-54ee-4968-a9d9-841cbd059c48/volumes" 
Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.866534 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d82b913-d184-4394-8ec8-bc43006d5c38" path="/var/lib/kubelet/pods/0d82b913-d184-4394-8ec8-bc43006d5c38/volumes" Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.867601 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14e64073-f43f-4db5-9c64-78f928a2e022" path="/var/lib/kubelet/pods/14e64073-f43f-4db5-9c64-78f928a2e022/volumes" Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.868687 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4603cbff-4678-463a-b7bb-187826d1717c" path="/var/lib/kubelet/pods/4603cbff-4678-463a-b7bb-187826d1717c/volumes" Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.870996 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63b10f94-2d2e-402a-8f62-92c7e33eef92" path="/var/lib/kubelet/pods/63b10f94-2d2e-402a-8f62-92c7e33eef92/volumes" Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.872050 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a01b0b7-7dab-4020-811f-55c4d0f5b45d" path="/var/lib/kubelet/pods/7a01b0b7-7dab-4020-811f-55c4d0f5b45d/volumes" Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.874644 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d65339d-58d4-4124-9e26-1997be504d6a" path="/var/lib/kubelet/pods/7d65339d-58d4-4124-9e26-1997be504d6a/volumes" Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.875827 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b8eb5c9-8b94-486b-b8e2-f3ff1b889928" path="/var/lib/kubelet/pods/8b8eb5c9-8b94-486b-b8e2-f3ff1b889928/volumes" Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.877244 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf2a62ed-7175-456e-953b-551c3148ba91" path="/var/lib/kubelet/pods/bf2a62ed-7175-456e-953b-551c3148ba91/volumes" 
Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.879628 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b6d15a-036b-4a52-b581-c5ebed291529" path="/var/lib/kubelet/pods/d0b6d15a-036b-4a52-b581-c5ebed291529/volumes" Oct 05 20:50:45 crc kubenswrapper[4753]: I1005 20:50:45.881363 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9296ffc-298c-49ec-a282-e8906b12ef70" path="/var/lib/kubelet/pods/e9296ffc-298c-49ec-a282-e8906b12ef70/volumes" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.179231 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk"] Oct 05 20:50:59 crc kubenswrapper[4753]: E1005 20:50:59.180246 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619a35fc-6a65-42cb-a0a0-9e2605111a42" containerName="extract-utilities" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.180263 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="619a35fc-6a65-42cb-a0a0-9e2605111a42" containerName="extract-utilities" Oct 05 20:50:59 crc kubenswrapper[4753]: E1005 20:50:59.180283 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619a35fc-6a65-42cb-a0a0-9e2605111a42" containerName="registry-server" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.180292 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="619a35fc-6a65-42cb-a0a0-9e2605111a42" containerName="registry-server" Oct 05 20:50:59 crc kubenswrapper[4753]: E1005 20:50:59.180306 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" containerName="extract-content" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.180314 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" containerName="extract-content" Oct 05 20:50:59 crc kubenswrapper[4753]: E1005 20:50:59.180348 4753 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" containerName="registry-server" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.180355 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" containerName="registry-server" Oct 05 20:50:59 crc kubenswrapper[4753]: E1005 20:50:59.180368 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619a35fc-6a65-42cb-a0a0-9e2605111a42" containerName="extract-content" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.180375 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="619a35fc-6a65-42cb-a0a0-9e2605111a42" containerName="extract-content" Oct 05 20:50:59 crc kubenswrapper[4753]: E1005 20:50:59.180389 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" containerName="extract-utilities" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.180397 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" containerName="extract-utilities" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.180611 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="619a35fc-6a65-42cb-a0a0-9e2605111a42" containerName="registry-server" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.180629 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa2a0a3-d1b8-4ff6-b9f0-acd0d98566fc" containerName="registry-server" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.181332 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.188721 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.189762 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.190080 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.190611 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.190862 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.205004 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk"] Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.332266 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rhdl\" (UniqueName: \"kubernetes.io/projected/91285735-785c-4889-9913-bb3e58ffed5f-kube-api-access-8rhdl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.332399 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.332469 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.332546 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.332579 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.433844 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.433928 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.433984 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.434010 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.434084 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rhdl\" (UniqueName: \"kubernetes.io/projected/91285735-785c-4889-9913-bb3e58ffed5f-kube-api-access-8rhdl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.441109 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") 
" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.442653 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.445730 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.448352 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.458851 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rhdl\" (UniqueName: \"kubernetes.io/projected/91285735-785c-4889-9913-bb3e58ffed5f-kube-api-access-8rhdl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:50:59 crc kubenswrapper[4753]: I1005 20:50:59.541050 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:51:00 crc kubenswrapper[4753]: I1005 20:51:00.068077 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk"] Oct 05 20:51:00 crc kubenswrapper[4753]: I1005 20:51:00.230468 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" event={"ID":"91285735-785c-4889-9913-bb3e58ffed5f","Type":"ContainerStarted","Data":"c6271b6e84043889a0c87546e7683f62fada2d24461ce50b68a0657a6906ac26"} Oct 05 20:51:01 crc kubenswrapper[4753]: I1005 20:51:01.241470 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" event={"ID":"91285735-785c-4889-9913-bb3e58ffed5f","Type":"ContainerStarted","Data":"30b7777fa547f2edb856c84d00269e26a5dafcfae0c5cf741a3162a5b22f414e"} Oct 05 20:51:01 crc kubenswrapper[4753]: I1005 20:51:01.262013 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" podStartSLOduration=1.78144166 podStartE2EDuration="2.261989879s" podCreationTimestamp="2025-10-05 20:50:59 +0000 UTC" firstStartedPulling="2025-10-05 20:51:00.099716018 +0000 UTC m=+2168.948044250" lastFinishedPulling="2025-10-05 20:51:00.580264197 +0000 UTC m=+2169.428592469" observedRunningTime="2025-10-05 20:51:01.258227732 +0000 UTC m=+2170.106556024" watchObservedRunningTime="2025-10-05 20:51:01.261989879 +0000 UTC m=+2170.110318121" Oct 05 20:51:03 crc kubenswrapper[4753]: I1005 20:51:03.582058 4753 scope.go:117] "RemoveContainer" containerID="6cb93bada9c1cf3185d356d6167b6c8ad40d5b60576815c72028d6dd9215d2fe" Oct 05 20:51:03 crc kubenswrapper[4753]: I1005 20:51:03.615053 4753 scope.go:117] "RemoveContainer" containerID="2ec01b1059ccf847a081ddaae6936d61d267e03806ffcfa6b8c49c4583313ecf" Oct 05 20:51:03 crc 
kubenswrapper[4753]: I1005 20:51:03.700944 4753 scope.go:117] "RemoveContainer" containerID="5e591415b24672a107360f75a655a19c54cbdbcd5327db7892e22141bbf8c7cc" Oct 05 20:51:03 crc kubenswrapper[4753]: I1005 20:51:03.750373 4753 scope.go:117] "RemoveContainer" containerID="ac40edaaf4c147bed35dc4d220690835d3a96a7ff30069626b8fe66a3f6b5c3f" Oct 05 20:51:03 crc kubenswrapper[4753]: I1005 20:51:03.808128 4753 scope.go:117] "RemoveContainer" containerID="99084cf66eb920dccd49c79c0cced862601fb3880ba396b1d3cc4ecc013c55c7" Oct 05 20:51:03 crc kubenswrapper[4753]: I1005 20:51:03.895821 4753 scope.go:117] "RemoveContainer" containerID="c56cea440ef60fbb4b698d0d8bd8acd986b98357fe70706b0c44084e495e58fc" Oct 05 20:51:03 crc kubenswrapper[4753]: I1005 20:51:03.933577 4753 scope.go:117] "RemoveContainer" containerID="bb531e045fb32d1ccb08ce22dd51084abb98705bf211ecea214b83c57141430b" Oct 05 20:51:03 crc kubenswrapper[4753]: I1005 20:51:03.965082 4753 scope.go:117] "RemoveContainer" containerID="f20abac237a6164b7dd0ca42d13ecda28b61969f1b84cf636c833d7b780bc91a" Oct 05 20:51:04 crc kubenswrapper[4753]: I1005 20:51:04.002386 4753 scope.go:117] "RemoveContainer" containerID="423a7d70191a6277b2e62814df0cc1435d907cc333dbbba7fded356bcbe254c5" Oct 05 20:51:04 crc kubenswrapper[4753]: I1005 20:51:04.066051 4753 scope.go:117] "RemoveContainer" containerID="f8b58365d310f3ee39931e5fdc515716c983721f6a42c06d4a70680276cbc356" Oct 05 20:51:04 crc kubenswrapper[4753]: I1005 20:51:04.105072 4753 scope.go:117] "RemoveContainer" containerID="6c6b8c27a07834f62a910221ea8ffd3da97777898b0a6bd915fb5d2efffd5d68" Oct 05 20:51:13 crc kubenswrapper[4753]: I1005 20:51:13.360922 4753 generic.go:334] "Generic (PLEG): container finished" podID="91285735-785c-4889-9913-bb3e58ffed5f" containerID="30b7777fa547f2edb856c84d00269e26a5dafcfae0c5cf741a3162a5b22f414e" exitCode=0 Oct 05 20:51:13 crc kubenswrapper[4753]: I1005 20:51:13.361046 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" event={"ID":"91285735-785c-4889-9913-bb3e58ffed5f","Type":"ContainerDied","Data":"30b7777fa547f2edb856c84d00269e26a5dafcfae0c5cf741a3162a5b22f414e"} Oct 05 20:51:14 crc kubenswrapper[4753]: I1005 20:51:14.869053 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.044984 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-ceph\") pod \"91285735-785c-4889-9913-bb3e58ffed5f\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.045082 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-ssh-key\") pod \"91285735-785c-4889-9913-bb3e58ffed5f\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.045179 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-repo-setup-combined-ca-bundle\") pod \"91285735-785c-4889-9913-bb3e58ffed5f\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.045227 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-inventory\") pod \"91285735-785c-4889-9913-bb3e58ffed5f\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.045248 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rhdl\" 
(UniqueName: \"kubernetes.io/projected/91285735-785c-4889-9913-bb3e58ffed5f-kube-api-access-8rhdl\") pod \"91285735-785c-4889-9913-bb3e58ffed5f\" (UID: \"91285735-785c-4889-9913-bb3e58ffed5f\") " Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.051023 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91285735-785c-4889-9913-bb3e58ffed5f-kube-api-access-8rhdl" (OuterVolumeSpecName: "kube-api-access-8rhdl") pod "91285735-785c-4889-9913-bb3e58ffed5f" (UID: "91285735-785c-4889-9913-bb3e58ffed5f"). InnerVolumeSpecName "kube-api-access-8rhdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.052289 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "91285735-785c-4889-9913-bb3e58ffed5f" (UID: "91285735-785c-4889-9913-bb3e58ffed5f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.052379 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-ceph" (OuterVolumeSpecName: "ceph") pod "91285735-785c-4889-9913-bb3e58ffed5f" (UID: "91285735-785c-4889-9913-bb3e58ffed5f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.076721 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-inventory" (OuterVolumeSpecName: "inventory") pod "91285735-785c-4889-9913-bb3e58ffed5f" (UID: "91285735-785c-4889-9913-bb3e58ffed5f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.078696 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "91285735-785c-4889-9913-bb3e58ffed5f" (UID: "91285735-785c-4889-9913-bb3e58ffed5f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.147345 4753 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.147379 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.147393 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rhdl\" (UniqueName: \"kubernetes.io/projected/91285735-785c-4889-9913-bb3e58ffed5f-kube-api-access-8rhdl\") on node \"crc\" DevicePath \"\"" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.147405 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.147412 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91285735-785c-4889-9913-bb3e58ffed5f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.379592 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" 
event={"ID":"91285735-785c-4889-9913-bb3e58ffed5f","Type":"ContainerDied","Data":"c6271b6e84043889a0c87546e7683f62fada2d24461ce50b68a0657a6906ac26"} Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.379641 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6271b6e84043889a0c87546e7683f62fada2d24461ce50b68a0657a6906ac26" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.379711 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.479632 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv"] Oct 05 20:51:15 crc kubenswrapper[4753]: E1005 20:51:15.480526 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91285735-785c-4889-9913-bb3e58ffed5f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.480562 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="91285735-785c-4889-9913-bb3e58ffed5f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.483067 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="91285735-785c-4889-9913-bb3e58ffed5f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.484123 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.487310 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv"] Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.490519 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.490659 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.490825 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.493273 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.493796 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.657245 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.657321 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.657369 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwmzs\" (UniqueName: \"kubernetes.io/projected/2e0a083e-4f35-4cbf-89af-348a03a81159-kube-api-access-lwmzs\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.657605 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.657736 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.760009 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwmzs\" (UniqueName: \"kubernetes.io/projected/2e0a083e-4f35-4cbf-89af-348a03a81159-kube-api-access-lwmzs\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.760112 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.760189 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.760281 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.760840 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.763994 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.764000 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.764625 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.772080 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.783813 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwmzs\" (UniqueName: \"kubernetes.io/projected/2e0a083e-4f35-4cbf-89af-348a03a81159-kube-api-access-lwmzs\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:15 crc kubenswrapper[4753]: I1005 20:51:15.823124 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:51:16 crc kubenswrapper[4753]: I1005 20:51:16.354892 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv"] Oct 05 20:51:16 crc kubenswrapper[4753]: W1005 20:51:16.361988 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e0a083e_4f35_4cbf_89af_348a03a81159.slice/crio-5705106adb03415836614a50e7e2611206662ccc58e5f4bbe29f54b7820a8374 WatchSource:0}: Error finding container 5705106adb03415836614a50e7e2611206662ccc58e5f4bbe29f54b7820a8374: Status 404 returned error can't find the container with id 5705106adb03415836614a50e7e2611206662ccc58e5f4bbe29f54b7820a8374 Oct 05 20:51:16 crc kubenswrapper[4753]: I1005 20:51:16.391228 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" event={"ID":"2e0a083e-4f35-4cbf-89af-348a03a81159","Type":"ContainerStarted","Data":"5705106adb03415836614a50e7e2611206662ccc58e5f4bbe29f54b7820a8374"} Oct 05 20:51:17 crc kubenswrapper[4753]: I1005 20:51:17.399208 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" event={"ID":"2e0a083e-4f35-4cbf-89af-348a03a81159","Type":"ContainerStarted","Data":"a710e06f23e842e668b031b8e51d8127f18e082e8a854259a5acad99a54ed22d"} Oct 05 20:51:17 crc kubenswrapper[4753]: I1005 20:51:17.416570 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" podStartSLOduration=1.986618488 podStartE2EDuration="2.416551945s" podCreationTimestamp="2025-10-05 20:51:15 +0000 UTC" firstStartedPulling="2025-10-05 20:51:16.364034661 +0000 UTC m=+2185.212362893" lastFinishedPulling="2025-10-05 20:51:16.793968118 +0000 UTC m=+2185.642296350" 
observedRunningTime="2025-10-05 20:51:17.413173609 +0000 UTC m=+2186.261501851" watchObservedRunningTime="2025-10-05 20:51:17.416551945 +0000 UTC m=+2186.264880177" Oct 05 20:51:34 crc kubenswrapper[4753]: I1005 20:51:34.489975 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:51:34 crc kubenswrapper[4753]: I1005 20:51:34.490667 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:52:04 crc kubenswrapper[4753]: I1005 20:52:04.489857 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 20:52:04 crc kubenswrapper[4753]: I1005 20:52:04.491504 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:52:34 crc kubenswrapper[4753]: I1005 20:52:34.490757 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 05 20:52:34 crc kubenswrapper[4753]: I1005 20:52:34.491600 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 20:52:34 crc kubenswrapper[4753]: I1005 20:52:34.491662 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 20:52:34 crc kubenswrapper[4753]: I1005 20:52:34.492680 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 20:52:34 crc kubenswrapper[4753]: I1005 20:52:34.492766 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" containerID="cri-o://eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" gracePeriod=600 Oct 05 20:52:34 crc kubenswrapper[4753]: E1005 20:52:34.617310 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:52:35 crc kubenswrapper[4753]: I1005 20:52:35.096934 
4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" exitCode=0 Oct 05 20:52:35 crc kubenswrapper[4753]: I1005 20:52:35.096982 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41"} Oct 05 20:52:35 crc kubenswrapper[4753]: I1005 20:52:35.097017 4753 scope.go:117] "RemoveContainer" containerID="5952695b2bbac0c6b13de72d6aaeea0b402956b255b3d9a665facde6fb89aed3" Oct 05 20:52:35 crc kubenswrapper[4753]: I1005 20:52:35.097389 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:52:35 crc kubenswrapper[4753]: E1005 20:52:35.097626 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:52:49 crc kubenswrapper[4753]: I1005 20:52:49.852117 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:52:49 crc kubenswrapper[4753]: E1005 20:52:49.853984 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:53:02 crc kubenswrapper[4753]: I1005 20:53:02.852757 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:53:02 crc kubenswrapper[4753]: E1005 20:53:02.853544 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:53:05 crc kubenswrapper[4753]: I1005 20:53:05.369619 4753 generic.go:334] "Generic (PLEG): container finished" podID="2e0a083e-4f35-4cbf-89af-348a03a81159" containerID="a710e06f23e842e668b031b8e51d8127f18e082e8a854259a5acad99a54ed22d" exitCode=0 Oct 05 20:53:05 crc kubenswrapper[4753]: I1005 20:53:05.369818 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" event={"ID":"2e0a083e-4f35-4cbf-89af-348a03a81159","Type":"ContainerDied","Data":"a710e06f23e842e668b031b8e51d8127f18e082e8a854259a5acad99a54ed22d"} Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.825038 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.863720 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-bootstrap-combined-ca-bundle\") pod \"2e0a083e-4f35-4cbf-89af-348a03a81159\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.863791 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-inventory\") pod \"2e0a083e-4f35-4cbf-89af-348a03a81159\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.863822 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-ssh-key\") pod \"2e0a083e-4f35-4cbf-89af-348a03a81159\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.863865 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwmzs\" (UniqueName: \"kubernetes.io/projected/2e0a083e-4f35-4cbf-89af-348a03a81159-kube-api-access-lwmzs\") pod \"2e0a083e-4f35-4cbf-89af-348a03a81159\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.863986 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-ceph\") pod \"2e0a083e-4f35-4cbf-89af-348a03a81159\" (UID: \"2e0a083e-4f35-4cbf-89af-348a03a81159\") " Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.886718 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-ceph" (OuterVolumeSpecName: "ceph") pod "2e0a083e-4f35-4cbf-89af-348a03a81159" (UID: "2e0a083e-4f35-4cbf-89af-348a03a81159"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.890673 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e0a083e-4f35-4cbf-89af-348a03a81159-kube-api-access-lwmzs" (OuterVolumeSpecName: "kube-api-access-lwmzs") pod "2e0a083e-4f35-4cbf-89af-348a03a81159" (UID: "2e0a083e-4f35-4cbf-89af-348a03a81159"). InnerVolumeSpecName "kube-api-access-lwmzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.894224 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2e0a083e-4f35-4cbf-89af-348a03a81159" (UID: "2e0a083e-4f35-4cbf-89af-348a03a81159"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.895686 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2e0a083e-4f35-4cbf-89af-348a03a81159" (UID: "2e0a083e-4f35-4cbf-89af-348a03a81159"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.901352 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-inventory" (OuterVolumeSpecName: "inventory") pod "2e0a083e-4f35-4cbf-89af-348a03a81159" (UID: "2e0a083e-4f35-4cbf-89af-348a03a81159"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.966127 4753 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.966176 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.966187 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.966197 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwmzs\" (UniqueName: \"kubernetes.io/projected/2e0a083e-4f35-4cbf-89af-348a03a81159-kube-api-access-lwmzs\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:06 crc kubenswrapper[4753]: I1005 20:53:06.966206 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e0a083e-4f35-4cbf-89af-348a03a81159-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.387738 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" event={"ID":"2e0a083e-4f35-4cbf-89af-348a03a81159","Type":"ContainerDied","Data":"5705106adb03415836614a50e7e2611206662ccc58e5f4bbe29f54b7820a8374"} Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.387776 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5705106adb03415836614a50e7e2611206662ccc58e5f4bbe29f54b7820a8374" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.388124 4753 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.485075 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn"] Oct 05 20:53:07 crc kubenswrapper[4753]: E1005 20:53:07.485411 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0a083e-4f35-4cbf-89af-348a03a81159" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.485427 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0a083e-4f35-4cbf-89af-348a03a81159" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.485618 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e0a083e-4f35-4cbf-89af-348a03a81159" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.486126 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.488227 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.488339 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.488506 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.488908 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.489053 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.506477 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn"] Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.581957 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn\" (UID: \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.582014 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk2ms\" (UniqueName: \"kubernetes.io/projected/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-kube-api-access-qk2ms\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn\" (UID: 
\"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.582043 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn\" (UID: \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.582092 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn\" (UID: \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.683352 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn\" (UID: \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.683470 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn\" (UID: \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.683496 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qk2ms\" (UniqueName: \"kubernetes.io/projected/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-kube-api-access-qk2ms\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn\" (UID: \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.683517 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn\" (UID: \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.696922 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn\" (UID: \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.697462 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn\" (UID: \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.698081 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn\" (UID: 
\"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.704836 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk2ms\" (UniqueName: \"kubernetes.io/projected/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-kube-api-access-qk2ms\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn\" (UID: \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:07 crc kubenswrapper[4753]: I1005 20:53:07.813052 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:08 crc kubenswrapper[4753]: I1005 20:53:08.324717 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn"] Oct 05 20:53:08 crc kubenswrapper[4753]: I1005 20:53:08.394695 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" event={"ID":"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37","Type":"ContainerStarted","Data":"235960f9cac8ee05b09752d9eaf7dbbb8337e328240b96ef8909fd5c9496df71"} Oct 05 20:53:09 crc kubenswrapper[4753]: I1005 20:53:09.404761 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" event={"ID":"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37","Type":"ContainerStarted","Data":"e829aca3e37f38b5efec2174df01553ac2aaeaf7ab2e54f28155685915baee84"} Oct 05 20:53:10 crc kubenswrapper[4753]: I1005 20:53:10.554470 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" podStartSLOduration=3.090012192 podStartE2EDuration="3.55445131s" podCreationTimestamp="2025-10-05 20:53:07 
+0000 UTC" firstStartedPulling="2025-10-05 20:53:08.327189418 +0000 UTC m=+2297.175517650" lastFinishedPulling="2025-10-05 20:53:08.791628546 +0000 UTC m=+2297.639956768" observedRunningTime="2025-10-05 20:53:09.461542452 +0000 UTC m=+2298.309870684" watchObservedRunningTime="2025-10-05 20:53:10.55445131 +0000 UTC m=+2299.402779542" Oct 05 20:53:10 crc kubenswrapper[4753]: I1005 20:53:10.557122 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vrk58"] Oct 05 20:53:10 crc kubenswrapper[4753]: I1005 20:53:10.558858 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:10 crc kubenswrapper[4753]: I1005 20:53:10.567704 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vrk58"] Oct 05 20:53:10 crc kubenswrapper[4753]: I1005 20:53:10.746710 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlkqz\" (UniqueName: \"kubernetes.io/projected/8823f56c-431a-4c00-963f-310b13768c19-kube-api-access-nlkqz\") pod \"certified-operators-vrk58\" (UID: \"8823f56c-431a-4c00-963f-310b13768c19\") " pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:10 crc kubenswrapper[4753]: I1005 20:53:10.747052 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8823f56c-431a-4c00-963f-310b13768c19-utilities\") pod \"certified-operators-vrk58\" (UID: \"8823f56c-431a-4c00-963f-310b13768c19\") " pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:10 crc kubenswrapper[4753]: I1005 20:53:10.747070 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8823f56c-431a-4c00-963f-310b13768c19-catalog-content\") pod 
\"certified-operators-vrk58\" (UID: \"8823f56c-431a-4c00-963f-310b13768c19\") " pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:10 crc kubenswrapper[4753]: I1005 20:53:10.848807 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlkqz\" (UniqueName: \"kubernetes.io/projected/8823f56c-431a-4c00-963f-310b13768c19-kube-api-access-nlkqz\") pod \"certified-operators-vrk58\" (UID: \"8823f56c-431a-4c00-963f-310b13768c19\") " pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:10 crc kubenswrapper[4753]: I1005 20:53:10.848855 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8823f56c-431a-4c00-963f-310b13768c19-utilities\") pod \"certified-operators-vrk58\" (UID: \"8823f56c-431a-4c00-963f-310b13768c19\") " pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:10 crc kubenswrapper[4753]: I1005 20:53:10.848875 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8823f56c-431a-4c00-963f-310b13768c19-catalog-content\") pod \"certified-operators-vrk58\" (UID: \"8823f56c-431a-4c00-963f-310b13768c19\") " pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:10 crc kubenswrapper[4753]: I1005 20:53:10.849397 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8823f56c-431a-4c00-963f-310b13768c19-catalog-content\") pod \"certified-operators-vrk58\" (UID: \"8823f56c-431a-4c00-963f-310b13768c19\") " pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:10 crc kubenswrapper[4753]: I1005 20:53:10.849572 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8823f56c-431a-4c00-963f-310b13768c19-utilities\") pod \"certified-operators-vrk58\" (UID: 
\"8823f56c-431a-4c00-963f-310b13768c19\") " pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:10 crc kubenswrapper[4753]: I1005 20:53:10.875168 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlkqz\" (UniqueName: \"kubernetes.io/projected/8823f56c-431a-4c00-963f-310b13768c19-kube-api-access-nlkqz\") pod \"certified-operators-vrk58\" (UID: \"8823f56c-431a-4c00-963f-310b13768c19\") " pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:10 crc kubenswrapper[4753]: I1005 20:53:10.901575 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:11 crc kubenswrapper[4753]: I1005 20:53:11.390164 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vrk58"] Oct 05 20:53:11 crc kubenswrapper[4753]: I1005 20:53:11.422048 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrk58" event={"ID":"8823f56c-431a-4c00-963f-310b13768c19","Type":"ContainerStarted","Data":"971edb0fdfbc73f11786391bf36734d52367a044c84d91cb254c2ca59b17740e"} Oct 05 20:53:12 crc kubenswrapper[4753]: I1005 20:53:12.431361 4753 generic.go:334] "Generic (PLEG): container finished" podID="8823f56c-431a-4c00-963f-310b13768c19" containerID="3be0395a45bd99050796265d07fbe79736a61741d425a2147a6de64b45a0bceb" exitCode=0 Oct 05 20:53:12 crc kubenswrapper[4753]: I1005 20:53:12.431418 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrk58" event={"ID":"8823f56c-431a-4c00-963f-310b13768c19","Type":"ContainerDied","Data":"3be0395a45bd99050796265d07fbe79736a61741d425a2147a6de64b45a0bceb"} Oct 05 20:53:13 crc kubenswrapper[4753]: I1005 20:53:13.439504 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrk58" 
event={"ID":"8823f56c-431a-4c00-963f-310b13768c19","Type":"ContainerStarted","Data":"5038456d4c56615ff38b3969fdffdd28822cbcd0f057db98b9b0285e402af397"} Oct 05 20:53:14 crc kubenswrapper[4753]: I1005 20:53:14.451413 4753 generic.go:334] "Generic (PLEG): container finished" podID="8823f56c-431a-4c00-963f-310b13768c19" containerID="5038456d4c56615ff38b3969fdffdd28822cbcd0f057db98b9b0285e402af397" exitCode=0 Oct 05 20:53:14 crc kubenswrapper[4753]: I1005 20:53:14.451516 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrk58" event={"ID":"8823f56c-431a-4c00-963f-310b13768c19","Type":"ContainerDied","Data":"5038456d4c56615ff38b3969fdffdd28822cbcd0f057db98b9b0285e402af397"} Oct 05 20:53:15 crc kubenswrapper[4753]: I1005 20:53:15.462189 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrk58" event={"ID":"8823f56c-431a-4c00-963f-310b13768c19","Type":"ContainerStarted","Data":"e9d660e3272e405dfe02cf95e89f3d4fd734218072e5c1cfd0f4e64cd3a47240"} Oct 05 20:53:15 crc kubenswrapper[4753]: I1005 20:53:15.496336 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vrk58" podStartSLOduration=3.03479522 podStartE2EDuration="5.496305933s" podCreationTimestamp="2025-10-05 20:53:10 +0000 UTC" firstStartedPulling="2025-10-05 20:53:12.434504184 +0000 UTC m=+2301.282832426" lastFinishedPulling="2025-10-05 20:53:14.896014897 +0000 UTC m=+2303.744343139" observedRunningTime="2025-10-05 20:53:15.481641718 +0000 UTC m=+2304.329969990" watchObservedRunningTime="2025-10-05 20:53:15.496305933 +0000 UTC m=+2304.344634195" Oct 05 20:53:15 crc kubenswrapper[4753]: I1005 20:53:15.852597 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:53:15 crc kubenswrapper[4753]: E1005 20:53:15.852920 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:53:20 crc kubenswrapper[4753]: I1005 20:53:20.902675 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:20 crc kubenswrapper[4753]: I1005 20:53:20.903017 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:20 crc kubenswrapper[4753]: I1005 20:53:20.957702 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:21 crc kubenswrapper[4753]: I1005 20:53:21.554039 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:21 crc kubenswrapper[4753]: I1005 20:53:21.612964 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vrk58"] Oct 05 20:53:23 crc kubenswrapper[4753]: I1005 20:53:23.528526 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vrk58" podUID="8823f56c-431a-4c00-963f-310b13768c19" containerName="registry-server" containerID="cri-o://e9d660e3272e405dfe02cf95e89f3d4fd734218072e5c1cfd0f4e64cd3a47240" gracePeriod=2 Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.489632 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.541448 4753 generic.go:334] "Generic (PLEG): container finished" podID="8823f56c-431a-4c00-963f-310b13768c19" containerID="e9d660e3272e405dfe02cf95e89f3d4fd734218072e5c1cfd0f4e64cd3a47240" exitCode=0 Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.541496 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrk58" event={"ID":"8823f56c-431a-4c00-963f-310b13768c19","Type":"ContainerDied","Data":"e9d660e3272e405dfe02cf95e89f3d4fd734218072e5c1cfd0f4e64cd3a47240"} Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.541532 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrk58" event={"ID":"8823f56c-431a-4c00-963f-310b13768c19","Type":"ContainerDied","Data":"971edb0fdfbc73f11786391bf36734d52367a044c84d91cb254c2ca59b17740e"} Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.541575 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vrk58" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.541620 4753 scope.go:117] "RemoveContainer" containerID="e9d660e3272e405dfe02cf95e89f3d4fd734218072e5c1cfd0f4e64cd3a47240" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.570945 4753 scope.go:117] "RemoveContainer" containerID="5038456d4c56615ff38b3969fdffdd28822cbcd0f057db98b9b0285e402af397" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.599347 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8823f56c-431a-4c00-963f-310b13768c19-catalog-content\") pod \"8823f56c-431a-4c00-963f-310b13768c19\" (UID: \"8823f56c-431a-4c00-963f-310b13768c19\") " Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.599913 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlkqz\" (UniqueName: \"kubernetes.io/projected/8823f56c-431a-4c00-963f-310b13768c19-kube-api-access-nlkqz\") pod \"8823f56c-431a-4c00-963f-310b13768c19\" (UID: \"8823f56c-431a-4c00-963f-310b13768c19\") " Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.599939 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8823f56c-431a-4c00-963f-310b13768c19-utilities\") pod \"8823f56c-431a-4c00-963f-310b13768c19\" (UID: \"8823f56c-431a-4c00-963f-310b13768c19\") " Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.600856 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8823f56c-431a-4c00-963f-310b13768c19-utilities" (OuterVolumeSpecName: "utilities") pod "8823f56c-431a-4c00-963f-310b13768c19" (UID: "8823f56c-431a-4c00-963f-310b13768c19"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.610025 4753 scope.go:117] "RemoveContainer" containerID="3be0395a45bd99050796265d07fbe79736a61741d425a2147a6de64b45a0bceb" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.610105 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8823f56c-431a-4c00-963f-310b13768c19-kube-api-access-nlkqz" (OuterVolumeSpecName: "kube-api-access-nlkqz") pod "8823f56c-431a-4c00-963f-310b13768c19" (UID: "8823f56c-431a-4c00-963f-310b13768c19"). InnerVolumeSpecName "kube-api-access-nlkqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.643799 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8823f56c-431a-4c00-963f-310b13768c19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8823f56c-431a-4c00-963f-310b13768c19" (UID: "8823f56c-431a-4c00-963f-310b13768c19"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.659764 4753 scope.go:117] "RemoveContainer" containerID="e9d660e3272e405dfe02cf95e89f3d4fd734218072e5c1cfd0f4e64cd3a47240" Oct 05 20:53:24 crc kubenswrapper[4753]: E1005 20:53:24.660445 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9d660e3272e405dfe02cf95e89f3d4fd734218072e5c1cfd0f4e64cd3a47240\": container with ID starting with e9d660e3272e405dfe02cf95e89f3d4fd734218072e5c1cfd0f4e64cd3a47240 not found: ID does not exist" containerID="e9d660e3272e405dfe02cf95e89f3d4fd734218072e5c1cfd0f4e64cd3a47240" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.660539 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9d660e3272e405dfe02cf95e89f3d4fd734218072e5c1cfd0f4e64cd3a47240"} err="failed to get container status \"e9d660e3272e405dfe02cf95e89f3d4fd734218072e5c1cfd0f4e64cd3a47240\": rpc error: code = NotFound desc = could not find container \"e9d660e3272e405dfe02cf95e89f3d4fd734218072e5c1cfd0f4e64cd3a47240\": container with ID starting with e9d660e3272e405dfe02cf95e89f3d4fd734218072e5c1cfd0f4e64cd3a47240 not found: ID does not exist" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.660650 4753 scope.go:117] "RemoveContainer" containerID="5038456d4c56615ff38b3969fdffdd28822cbcd0f057db98b9b0285e402af397" Oct 05 20:53:24 crc kubenswrapper[4753]: E1005 20:53:24.661132 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5038456d4c56615ff38b3969fdffdd28822cbcd0f057db98b9b0285e402af397\": container with ID starting with 5038456d4c56615ff38b3969fdffdd28822cbcd0f057db98b9b0285e402af397 not found: ID does not exist" containerID="5038456d4c56615ff38b3969fdffdd28822cbcd0f057db98b9b0285e402af397" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.661189 
4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5038456d4c56615ff38b3969fdffdd28822cbcd0f057db98b9b0285e402af397"} err="failed to get container status \"5038456d4c56615ff38b3969fdffdd28822cbcd0f057db98b9b0285e402af397\": rpc error: code = NotFound desc = could not find container \"5038456d4c56615ff38b3969fdffdd28822cbcd0f057db98b9b0285e402af397\": container with ID starting with 5038456d4c56615ff38b3969fdffdd28822cbcd0f057db98b9b0285e402af397 not found: ID does not exist" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.661228 4753 scope.go:117] "RemoveContainer" containerID="3be0395a45bd99050796265d07fbe79736a61741d425a2147a6de64b45a0bceb" Oct 05 20:53:24 crc kubenswrapper[4753]: E1005 20:53:24.661483 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3be0395a45bd99050796265d07fbe79736a61741d425a2147a6de64b45a0bceb\": container with ID starting with 3be0395a45bd99050796265d07fbe79736a61741d425a2147a6de64b45a0bceb not found: ID does not exist" containerID="3be0395a45bd99050796265d07fbe79736a61741d425a2147a6de64b45a0bceb" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.661507 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3be0395a45bd99050796265d07fbe79736a61741d425a2147a6de64b45a0bceb"} err="failed to get container status \"3be0395a45bd99050796265d07fbe79736a61741d425a2147a6de64b45a0bceb\": rpc error: code = NotFound desc = could not find container \"3be0395a45bd99050796265d07fbe79736a61741d425a2147a6de64b45a0bceb\": container with ID starting with 3be0395a45bd99050796265d07fbe79736a61741d425a2147a6de64b45a0bceb not found: ID does not exist" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.704038 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8823f56c-431a-4c00-963f-310b13768c19-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.704068 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlkqz\" (UniqueName: \"kubernetes.io/projected/8823f56c-431a-4c00-963f-310b13768c19-kube-api-access-nlkqz\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.704078 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8823f56c-431a-4c00-963f-310b13768c19-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.872978 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vrk58"] Oct 05 20:53:24 crc kubenswrapper[4753]: I1005 20:53:24.881170 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vrk58"] Oct 05 20:53:25 crc kubenswrapper[4753]: I1005 20:53:25.862015 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8823f56c-431a-4c00-963f-310b13768c19" path="/var/lib/kubelet/pods/8823f56c-431a-4c00-963f-310b13768c19/volumes" Oct 05 20:53:26 crc kubenswrapper[4753]: I1005 20:53:26.852912 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:53:26 crc kubenswrapper[4753]: E1005 20:53:26.853388 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:53:38 crc kubenswrapper[4753]: I1005 20:53:38.852583 4753 
scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:53:38 crc kubenswrapper[4753]: E1005 20:53:38.853363 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:53:39 crc kubenswrapper[4753]: I1005 20:53:39.676595 4753 generic.go:334] "Generic (PLEG): container finished" podID="a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37" containerID="e829aca3e37f38b5efec2174df01553ac2aaeaf7ab2e54f28155685915baee84" exitCode=0 Oct 05 20:53:39 crc kubenswrapper[4753]: I1005 20:53:39.676678 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" event={"ID":"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37","Type":"ContainerDied","Data":"e829aca3e37f38b5efec2174df01553ac2aaeaf7ab2e54f28155685915baee84"} Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.114382 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.230795 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-ssh-key\") pod \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\" (UID: \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.230893 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-inventory\") pod \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\" (UID: \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.231113 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk2ms\" (UniqueName: \"kubernetes.io/projected/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-kube-api-access-qk2ms\") pod \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\" (UID: \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.231182 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-ceph\") pod \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\" (UID: \"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37\") " Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.236383 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-ceph" (OuterVolumeSpecName: "ceph") pod "a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37" (UID: "a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.236474 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-kube-api-access-qk2ms" (OuterVolumeSpecName: "kube-api-access-qk2ms") pod "a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37" (UID: "a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37"). InnerVolumeSpecName "kube-api-access-qk2ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.261398 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-inventory" (OuterVolumeSpecName: "inventory") pod "a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37" (UID: "a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.261505 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37" (UID: "a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.333133 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk2ms\" (UniqueName: \"kubernetes.io/projected/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-kube-api-access-qk2ms\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.333193 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.333202 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.333212 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.693005 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" event={"ID":"a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37","Type":"ContainerDied","Data":"235960f9cac8ee05b09752d9eaf7dbbb8337e328240b96ef8909fd5c9496df71"} Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.693042 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="235960f9cac8ee05b09752d9eaf7dbbb8337e328240b96ef8909fd5c9496df71" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.693242 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.788378 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v"] Oct 05 20:53:41 crc kubenswrapper[4753]: E1005 20:53:41.788969 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8823f56c-431a-4c00-963f-310b13768c19" containerName="registry-server" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.788987 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8823f56c-431a-4c00-963f-310b13768c19" containerName="registry-server" Oct 05 20:53:41 crc kubenswrapper[4753]: E1005 20:53:41.789000 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.789007 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 05 20:53:41 crc kubenswrapper[4753]: E1005 20:53:41.789037 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8823f56c-431a-4c00-963f-310b13768c19" containerName="extract-utilities" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.789043 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8823f56c-431a-4c00-963f-310b13768c19" containerName="extract-utilities" Oct 05 20:53:41 crc kubenswrapper[4753]: E1005 20:53:41.789059 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8823f56c-431a-4c00-963f-310b13768c19" containerName="extract-content" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.789065 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8823f56c-431a-4c00-963f-310b13768c19" containerName="extract-content" Oct 05 20:53:41 crc 
kubenswrapper[4753]: I1005 20:53:41.789249 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.789281 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8823f56c-431a-4c00-963f-310b13768c19" containerName="registry-server" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.789801 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.793500 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.793733 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.793826 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.793959 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.794274 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.814769 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v"] Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.944629 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng248\" (UniqueName: 
\"kubernetes.io/projected/e9938f80-4e3c-476e-bd1d-11e1646d9176-kube-api-access-ng248\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xs45v\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.945495 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xs45v\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.945521 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xs45v\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:41 crc kubenswrapper[4753]: I1005 20:53:41.946219 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xs45v\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:42 crc kubenswrapper[4753]: I1005 20:53:42.048598 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xs45v\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:42 crc kubenswrapper[4753]: I1005 20:53:42.048708 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng248\" (UniqueName: \"kubernetes.io/projected/e9938f80-4e3c-476e-bd1d-11e1646d9176-kube-api-access-ng248\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xs45v\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:42 crc kubenswrapper[4753]: I1005 20:53:42.048735 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xs45v\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:42 crc kubenswrapper[4753]: I1005 20:53:42.048754 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xs45v\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:42 crc kubenswrapper[4753]: I1005 20:53:42.053680 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xs45v\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:42 crc kubenswrapper[4753]: I1005 20:53:42.054615 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xs45v\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:42 crc kubenswrapper[4753]: I1005 20:53:42.058348 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xs45v\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:42 crc kubenswrapper[4753]: I1005 20:53:42.066102 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng248\" (UniqueName: \"kubernetes.io/projected/e9938f80-4e3c-476e-bd1d-11e1646d9176-kube-api-access-ng248\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xs45v\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:42 crc kubenswrapper[4753]: I1005 20:53:42.110206 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:42 crc kubenswrapper[4753]: I1005 20:53:42.643873 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v"] Oct 05 20:53:42 crc kubenswrapper[4753]: I1005 20:53:42.700121 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" event={"ID":"e9938f80-4e3c-476e-bd1d-11e1646d9176","Type":"ContainerStarted","Data":"80d952923157f8200347ee94ad58192db4e14e5e63f26ae8ba8401112ea582ff"} Oct 05 20:53:43 crc kubenswrapper[4753]: I1005 20:53:43.707514 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" event={"ID":"e9938f80-4e3c-476e-bd1d-11e1646d9176","Type":"ContainerStarted","Data":"2b8f860ab49a7dc8994146d9b17c39c487774a2f2003592bc03be61134c3453e"} Oct 05 20:53:43 crc kubenswrapper[4753]: I1005 20:53:43.732938 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" podStartSLOduration=2.132728655 podStartE2EDuration="2.732916767s" podCreationTimestamp="2025-10-05 20:53:41 +0000 UTC" firstStartedPulling="2025-10-05 20:53:42.652618261 +0000 UTC m=+2331.500946493" lastFinishedPulling="2025-10-05 20:53:43.252806373 +0000 UTC m=+2332.101134605" observedRunningTime="2025-10-05 20:53:43.725453966 +0000 UTC m=+2332.573782198" watchObservedRunningTime="2025-10-05 20:53:43.732916767 +0000 UTC m=+2332.581244999" Oct 05 20:53:49 crc kubenswrapper[4753]: I1005 20:53:49.763101 4753 generic.go:334] "Generic (PLEG): container finished" podID="e9938f80-4e3c-476e-bd1d-11e1646d9176" containerID="2b8f860ab49a7dc8994146d9b17c39c487774a2f2003592bc03be61134c3453e" exitCode=0 Oct 05 20:53:49 crc kubenswrapper[4753]: I1005 20:53:49.763219 4753 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" event={"ID":"e9938f80-4e3c-476e-bd1d-11e1646d9176","Type":"ContainerDied","Data":"2b8f860ab49a7dc8994146d9b17c39c487774a2f2003592bc03be61134c3453e"} Oct 05 20:53:49 crc kubenswrapper[4753]: I1005 20:53:49.853008 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:53:49 crc kubenswrapper[4753]: E1005 20:53:49.853497 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.189663 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.313848 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng248\" (UniqueName: \"kubernetes.io/projected/e9938f80-4e3c-476e-bd1d-11e1646d9176-kube-api-access-ng248\") pod \"e9938f80-4e3c-476e-bd1d-11e1646d9176\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.314158 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-inventory\") pod \"e9938f80-4e3c-476e-bd1d-11e1646d9176\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.314250 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-ceph\") pod \"e9938f80-4e3c-476e-bd1d-11e1646d9176\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.314378 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-ssh-key\") pod \"e9938f80-4e3c-476e-bd1d-11e1646d9176\" (UID: \"e9938f80-4e3c-476e-bd1d-11e1646d9176\") " Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.320549 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-ceph" (OuterVolumeSpecName: "ceph") pod "e9938f80-4e3c-476e-bd1d-11e1646d9176" (UID: "e9938f80-4e3c-476e-bd1d-11e1646d9176"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.321518 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9938f80-4e3c-476e-bd1d-11e1646d9176-kube-api-access-ng248" (OuterVolumeSpecName: "kube-api-access-ng248") pod "e9938f80-4e3c-476e-bd1d-11e1646d9176" (UID: "e9938f80-4e3c-476e-bd1d-11e1646d9176"). InnerVolumeSpecName "kube-api-access-ng248". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.354031 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-inventory" (OuterVolumeSpecName: "inventory") pod "e9938f80-4e3c-476e-bd1d-11e1646d9176" (UID: "e9938f80-4e3c-476e-bd1d-11e1646d9176"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.362050 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e9938f80-4e3c-476e-bd1d-11e1646d9176" (UID: "e9938f80-4e3c-476e-bd1d-11e1646d9176"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.416521 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng248\" (UniqueName: \"kubernetes.io/projected/e9938f80-4e3c-476e-bd1d-11e1646d9176-kube-api-access-ng248\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.416555 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.416568 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.416580 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9938f80-4e3c-476e-bd1d-11e1646d9176-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.781315 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" event={"ID":"e9938f80-4e3c-476e-bd1d-11e1646d9176","Type":"ContainerDied","Data":"80d952923157f8200347ee94ad58192db4e14e5e63f26ae8ba8401112ea582ff"} Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.781351 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80d952923157f8200347ee94ad58192db4e14e5e63f26ae8ba8401112ea582ff" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.781376 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xs45v" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.882596 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn"] Oct 05 20:53:51 crc kubenswrapper[4753]: E1005 20:53:51.883125 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9938f80-4e3c-476e-bd1d-11e1646d9176" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.883244 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9938f80-4e3c-476e-bd1d-11e1646d9176" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.887450 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9938f80-4e3c-476e-bd1d-11e1646d9176" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.892824 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.896033 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.896407 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.897708 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.899728 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.900663 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.909329 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn"] Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.932369 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4c5cn\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.932452 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4c5cn\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.932485 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4c5cn\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:53:51 crc kubenswrapper[4753]: I1005 20:53:51.932655 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrblc\" (UniqueName: \"kubernetes.io/projected/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-kube-api-access-nrblc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4c5cn\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:53:52 crc kubenswrapper[4753]: I1005 20:53:52.035221 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4c5cn\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:53:52 crc kubenswrapper[4753]: I1005 20:53:52.036424 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4c5cn\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:53:52 crc kubenswrapper[4753]: I1005 20:53:52.036481 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4c5cn\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:53:52 crc kubenswrapper[4753]: I1005 20:53:52.036554 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrblc\" (UniqueName: \"kubernetes.io/projected/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-kube-api-access-nrblc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4c5cn\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:53:52 crc kubenswrapper[4753]: I1005 20:53:52.042029 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4c5cn\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:53:52 crc kubenswrapper[4753]: I1005 20:53:52.043754 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4c5cn\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:53:52 crc kubenswrapper[4753]: I1005 20:53:52.042057 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4c5cn\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:53:52 crc kubenswrapper[4753]: I1005 20:53:52.054105 
4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrblc\" (UniqueName: \"kubernetes.io/projected/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-kube-api-access-nrblc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4c5cn\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:53:52 crc kubenswrapper[4753]: I1005 20:53:52.221852 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:53:52 crc kubenswrapper[4753]: I1005 20:53:52.557582 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn"] Oct 05 20:53:52 crc kubenswrapper[4753]: I1005 20:53:52.573444 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 05 20:53:52 crc kubenswrapper[4753]: I1005 20:53:52.793029 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" event={"ID":"d5a16b03-a799-4548-8a7f-bf73d3f4a52a","Type":"ContainerStarted","Data":"3f3c2d476c07a6deaa64625a276ad3d9b4d09a1cb2b138b2e35fad87a70fe0b1"} Oct 05 20:53:53 crc kubenswrapper[4753]: I1005 20:53:53.806426 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" event={"ID":"d5a16b03-a799-4548-8a7f-bf73d3f4a52a","Type":"ContainerStarted","Data":"7c6176baedf0f2f284de030b2ec202af02fa0faeba2aaf1d7dcac28b25f8ae85"} Oct 05 20:53:53 crc kubenswrapper[4753]: I1005 20:53:53.835234 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" podStartSLOduration=2.380887232 podStartE2EDuration="2.83520335s" podCreationTimestamp="2025-10-05 20:53:51 +0000 UTC" firstStartedPulling="2025-10-05 20:53:52.573224112 +0000 
UTC m=+2341.421552344" lastFinishedPulling="2025-10-05 20:53:53.02754019 +0000 UTC m=+2341.875868462" observedRunningTime="2025-10-05 20:53:53.822999672 +0000 UTC m=+2342.671328244" watchObservedRunningTime="2025-10-05 20:53:53.83520335 +0000 UTC m=+2342.683531622" Oct 05 20:54:02 crc kubenswrapper[4753]: I1005 20:54:02.852162 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:54:02 crc kubenswrapper[4753]: E1005 20:54:02.852963 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:54:15 crc kubenswrapper[4753]: I1005 20:54:15.853096 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:54:15 crc kubenswrapper[4753]: E1005 20:54:15.854012 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:54:29 crc kubenswrapper[4753]: I1005 20:54:29.852203 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:54:29 crc kubenswrapper[4753]: E1005 20:54:29.853005 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:54:37 crc kubenswrapper[4753]: I1005 20:54:37.206500 4753 generic.go:334] "Generic (PLEG): container finished" podID="d5a16b03-a799-4548-8a7f-bf73d3f4a52a" containerID="7c6176baedf0f2f284de030b2ec202af02fa0faeba2aaf1d7dcac28b25f8ae85" exitCode=0 Oct 05 20:54:37 crc kubenswrapper[4753]: I1005 20:54:37.206621 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" event={"ID":"d5a16b03-a799-4548-8a7f-bf73d3f4a52a","Type":"ContainerDied","Data":"7c6176baedf0f2f284de030b2ec202af02fa0faeba2aaf1d7dcac28b25f8ae85"} Oct 05 20:54:38 crc kubenswrapper[4753]: I1005 20:54:38.657087 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:54:38 crc kubenswrapper[4753]: I1005 20:54:38.803134 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-ceph\") pod \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " Oct 05 20:54:38 crc kubenswrapper[4753]: I1005 20:54:38.803252 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-inventory\") pod \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " Oct 05 20:54:38 crc kubenswrapper[4753]: I1005 20:54:38.804036 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrblc\" (UniqueName: 
\"kubernetes.io/projected/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-kube-api-access-nrblc\") pod \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " Oct 05 20:54:38 crc kubenswrapper[4753]: I1005 20:54:38.804081 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-ssh-key\") pod \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\" (UID: \"d5a16b03-a799-4548-8a7f-bf73d3f4a52a\") " Oct 05 20:54:38 crc kubenswrapper[4753]: I1005 20:54:38.813469 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-kube-api-access-nrblc" (OuterVolumeSpecName: "kube-api-access-nrblc") pod "d5a16b03-a799-4548-8a7f-bf73d3f4a52a" (UID: "d5a16b03-a799-4548-8a7f-bf73d3f4a52a"). InnerVolumeSpecName "kube-api-access-nrblc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:54:38 crc kubenswrapper[4753]: I1005 20:54:38.814253 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-ceph" (OuterVolumeSpecName: "ceph") pod "d5a16b03-a799-4548-8a7f-bf73d3f4a52a" (UID: "d5a16b03-a799-4548-8a7f-bf73d3f4a52a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:54:38 crc kubenswrapper[4753]: I1005 20:54:38.827359 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-inventory" (OuterVolumeSpecName: "inventory") pod "d5a16b03-a799-4548-8a7f-bf73d3f4a52a" (UID: "d5a16b03-a799-4548-8a7f-bf73d3f4a52a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:54:38 crc kubenswrapper[4753]: I1005 20:54:38.838911 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d5a16b03-a799-4548-8a7f-bf73d3f4a52a" (UID: "d5a16b03-a799-4548-8a7f-bf73d3f4a52a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:54:38 crc kubenswrapper[4753]: I1005 20:54:38.906756 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrblc\" (UniqueName: \"kubernetes.io/projected/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-kube-api-access-nrblc\") on node \"crc\" DevicePath \"\"" Oct 05 20:54:38 crc kubenswrapper[4753]: I1005 20:54:38.906795 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:54:38 crc kubenswrapper[4753]: I1005 20:54:38.906804 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 20:54:38 crc kubenswrapper[4753]: I1005 20:54:38.906811 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5a16b03-a799-4548-8a7f-bf73d3f4a52a-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.227511 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" event={"ID":"d5a16b03-a799-4548-8a7f-bf73d3f4a52a","Type":"ContainerDied","Data":"3f3c2d476c07a6deaa64625a276ad3d9b4d09a1cb2b138b2e35fad87a70fe0b1"} Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.227560 4753 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3f3c2d476c07a6deaa64625a276ad3d9b4d09a1cb2b138b2e35fad87a70fe0b1" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.227580 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4c5cn" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.354095 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42"] Oct 05 20:54:39 crc kubenswrapper[4753]: E1005 20:54:39.355857 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a16b03-a799-4548-8a7f-bf73d3f4a52a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.355878 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a16b03-a799-4548-8a7f-bf73d3f4a52a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.356351 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a16b03-a799-4548-8a7f-bf73d3f4a52a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.358669 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.383970 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.391316 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.391367 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.391419 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.391658 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.391956 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42"] Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.413056 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42\" (UID: \"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.413175 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42\" (UID: \"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.413239 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8d82\" (UniqueName: \"kubernetes.io/projected/73e27d2b-d430-4df5-9380-e3b3f6a75420-kube-api-access-j8d82\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42\" (UID: \"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.413268 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42\" (UID: \"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:39 crc kubenswrapper[4753]: E1005 20:54:39.434635 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5a16b03_a799_4548_8a7f_bf73d3f4a52a.slice/crio-3f3c2d476c07a6deaa64625a276ad3d9b4d09a1cb2b138b2e35fad87a70fe0b1\": RecentStats: unable to find data in memory cache]" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.515154 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42\" (UID: \"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.515229 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42\" (UID: \"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.515279 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8d82\" (UniqueName: \"kubernetes.io/projected/73e27d2b-d430-4df5-9380-e3b3f6a75420-kube-api-access-j8d82\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42\" (UID: \"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.515307 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42\" (UID: \"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.520022 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42\" (UID: \"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.534582 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42\" (UID: \"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:39 crc 
kubenswrapper[4753]: I1005 20:54:39.535568 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42\" (UID: \"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.543861 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8d82\" (UniqueName: \"kubernetes.io/projected/73e27d2b-d430-4df5-9380-e3b3f6a75420-kube-api-access-j8d82\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42\" (UID: \"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:39 crc kubenswrapper[4753]: I1005 20:54:39.684992 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:40 crc kubenswrapper[4753]: W1005 20:54:40.227725 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73e27d2b_d430_4df5_9380_e3b3f6a75420.slice/crio-c56c13ded92e51792001c795cbbf888cd01dbffd06ff9e05eb78dd0b54be28aa WatchSource:0}: Error finding container c56c13ded92e51792001c795cbbf888cd01dbffd06ff9e05eb78dd0b54be28aa: Status 404 returned error can't find the container with id c56c13ded92e51792001c795cbbf888cd01dbffd06ff9e05eb78dd0b54be28aa Oct 05 20:54:40 crc kubenswrapper[4753]: I1005 20:54:40.228613 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42"] Oct 05 20:54:40 crc kubenswrapper[4753]: I1005 20:54:40.235129 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" 
event={"ID":"73e27d2b-d430-4df5-9380-e3b3f6a75420","Type":"ContainerStarted","Data":"c56c13ded92e51792001c795cbbf888cd01dbffd06ff9e05eb78dd0b54be28aa"} Oct 05 20:54:41 crc kubenswrapper[4753]: I1005 20:54:41.243516 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" event={"ID":"73e27d2b-d430-4df5-9380-e3b3f6a75420","Type":"ContainerStarted","Data":"b383cfe24f8a929033aa48f029ade15f46e699f060f70db24980fbed3f39958a"} Oct 05 20:54:41 crc kubenswrapper[4753]: I1005 20:54:41.264198 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" podStartSLOduration=1.7299231210000001 podStartE2EDuration="2.264175302s" podCreationTimestamp="2025-10-05 20:54:39 +0000 UTC" firstStartedPulling="2025-10-05 20:54:40.229248379 +0000 UTC m=+2389.077576621" lastFinishedPulling="2025-10-05 20:54:40.76350055 +0000 UTC m=+2389.611828802" observedRunningTime="2025-10-05 20:54:41.260500289 +0000 UTC m=+2390.108828521" watchObservedRunningTime="2025-10-05 20:54:41.264175302 +0000 UTC m=+2390.112503534" Oct 05 20:54:42 crc kubenswrapper[4753]: I1005 20:54:42.852157 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:54:42 crc kubenswrapper[4753]: E1005 20:54:42.852613 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:54:45 crc kubenswrapper[4753]: I1005 20:54:45.275531 4753 generic.go:334] "Generic (PLEG): container finished" podID="73e27d2b-d430-4df5-9380-e3b3f6a75420" 
containerID="b383cfe24f8a929033aa48f029ade15f46e699f060f70db24980fbed3f39958a" exitCode=0 Oct 05 20:54:45 crc kubenswrapper[4753]: I1005 20:54:45.275634 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" event={"ID":"73e27d2b-d430-4df5-9380-e3b3f6a75420","Type":"ContainerDied","Data":"b383cfe24f8a929033aa48f029ade15f46e699f060f70db24980fbed3f39958a"} Oct 05 20:54:46 crc kubenswrapper[4753]: I1005 20:54:46.692392 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:46 crc kubenswrapper[4753]: I1005 20:54:46.738632 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-inventory\") pod \"73e27d2b-d430-4df5-9380-e3b3f6a75420\" (UID: \"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " Oct 05 20:54:46 crc kubenswrapper[4753]: I1005 20:54:46.738675 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-ceph\") pod \"73e27d2b-d430-4df5-9380-e3b3f6a75420\" (UID: \"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " Oct 05 20:54:46 crc kubenswrapper[4753]: I1005 20:54:46.738731 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8d82\" (UniqueName: \"kubernetes.io/projected/73e27d2b-d430-4df5-9380-e3b3f6a75420-kube-api-access-j8d82\") pod \"73e27d2b-d430-4df5-9380-e3b3f6a75420\" (UID: \"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " Oct 05 20:54:46 crc kubenswrapper[4753]: I1005 20:54:46.739274 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-ssh-key\") pod \"73e27d2b-d430-4df5-9380-e3b3f6a75420\" (UID: 
\"73e27d2b-d430-4df5-9380-e3b3f6a75420\") " Oct 05 20:54:46 crc kubenswrapper[4753]: I1005 20:54:46.751234 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-ceph" (OuterVolumeSpecName: "ceph") pod "73e27d2b-d430-4df5-9380-e3b3f6a75420" (UID: "73e27d2b-d430-4df5-9380-e3b3f6a75420"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:54:46 crc kubenswrapper[4753]: I1005 20:54:46.752843 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73e27d2b-d430-4df5-9380-e3b3f6a75420-kube-api-access-j8d82" (OuterVolumeSpecName: "kube-api-access-j8d82") pod "73e27d2b-d430-4df5-9380-e3b3f6a75420" (UID: "73e27d2b-d430-4df5-9380-e3b3f6a75420"). InnerVolumeSpecName "kube-api-access-j8d82". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:54:46 crc kubenswrapper[4753]: I1005 20:54:46.773224 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "73e27d2b-d430-4df5-9380-e3b3f6a75420" (UID: "73e27d2b-d430-4df5-9380-e3b3f6a75420"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:54:46 crc kubenswrapper[4753]: I1005 20:54:46.780038 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-inventory" (OuterVolumeSpecName: "inventory") pod "73e27d2b-d430-4df5-9380-e3b3f6a75420" (UID: "73e27d2b-d430-4df5-9380-e3b3f6a75420"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:54:46 crc kubenswrapper[4753]: I1005 20:54:46.840721 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8d82\" (UniqueName: \"kubernetes.io/projected/73e27d2b-d430-4df5-9380-e3b3f6a75420-kube-api-access-j8d82\") on node \"crc\" DevicePath \"\"" Oct 05 20:54:46 crc kubenswrapper[4753]: I1005 20:54:46.840754 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:54:46 crc kubenswrapper[4753]: I1005 20:54:46.840765 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:54:46 crc kubenswrapper[4753]: I1005 20:54:46.840775 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73e27d2b-d430-4df5-9380-e3b3f6a75420-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.293074 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" event={"ID":"73e27d2b-d430-4df5-9380-e3b3f6a75420","Type":"ContainerDied","Data":"c56c13ded92e51792001c795cbbf888cd01dbffd06ff9e05eb78dd0b54be28aa"} Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.293641 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c56c13ded92e51792001c795cbbf888cd01dbffd06ff9e05eb78dd0b54be28aa" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.293750 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.375637 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj"] Oct 05 20:54:47 crc kubenswrapper[4753]: E1005 20:54:47.376370 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e27d2b-d430-4df5-9380-e3b3f6a75420" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.376477 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e27d2b-d430-4df5-9380-e3b3f6a75420" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.376811 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="73e27d2b-d430-4df5-9380-e3b3f6a75420" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.377638 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.384074 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj"] Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.385085 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.385129 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.385200 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.385674 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.386008 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.449790 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.449847 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.449880 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.449982 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5226\" (UniqueName: \"kubernetes.io/projected/f21504e5-2012-4b4a-a3fc-16e6dc364373-kube-api-access-l5226\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.551292 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.551357 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.551396 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.551498 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5226\" (UniqueName: \"kubernetes.io/projected/f21504e5-2012-4b4a-a3fc-16e6dc364373-kube-api-access-l5226\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.555260 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.556344 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.561758 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:54:47 crc 
kubenswrapper[4753]: I1005 20:54:47.582577 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5226\" (UniqueName: \"kubernetes.io/projected/f21504e5-2012-4b4a-a3fc-16e6dc364373-kube-api-access-l5226\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:54:47 crc kubenswrapper[4753]: I1005 20:54:47.695559 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:54:48 crc kubenswrapper[4753]: I1005 20:54:48.218546 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj"] Oct 05 20:54:48 crc kubenswrapper[4753]: I1005 20:54:48.301397 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" event={"ID":"f21504e5-2012-4b4a-a3fc-16e6dc364373","Type":"ContainerStarted","Data":"59656413db51d3e1a1c910127d09c11791e1552797157d75b2985e1671625b7e"} Oct 05 20:54:49 crc kubenswrapper[4753]: I1005 20:54:49.310465 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" event={"ID":"f21504e5-2012-4b4a-a3fc-16e6dc364373","Type":"ContainerStarted","Data":"27deda40dd04ac58b314e020ab9f7ca47ff71672c8311d934208899b64aa79b7"} Oct 05 20:54:49 crc kubenswrapper[4753]: I1005 20:54:49.333327 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" podStartSLOduration=1.876666647 podStartE2EDuration="2.333308416s" podCreationTimestamp="2025-10-05 20:54:47 +0000 UTC" firstStartedPulling="2025-10-05 20:54:48.228638816 +0000 UTC m=+2397.076967048" lastFinishedPulling="2025-10-05 20:54:48.685280585 +0000 UTC m=+2397.533608817" 
observedRunningTime="2025-10-05 20:54:49.323831884 +0000 UTC m=+2398.172160116" watchObservedRunningTime="2025-10-05 20:54:49.333308416 +0000 UTC m=+2398.181636648" Oct 05 20:54:55 crc kubenswrapper[4753]: I1005 20:54:55.853572 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:54:55 crc kubenswrapper[4753]: E1005 20:54:55.854466 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:55:06 crc kubenswrapper[4753]: I1005 20:55:06.852601 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:55:06 crc kubenswrapper[4753]: E1005 20:55:06.853325 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:55:19 crc kubenswrapper[4753]: I1005 20:55:19.852264 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:55:19 crc kubenswrapper[4753]: E1005 20:55:19.853179 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:55:34 crc kubenswrapper[4753]: I1005 20:55:34.852272 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:55:34 crc kubenswrapper[4753]: E1005 20:55:34.852976 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:55:43 crc kubenswrapper[4753]: I1005 20:55:43.795387 4753 generic.go:334] "Generic (PLEG): container finished" podID="f21504e5-2012-4b4a-a3fc-16e6dc364373" containerID="27deda40dd04ac58b314e020ab9f7ca47ff71672c8311d934208899b64aa79b7" exitCode=0 Oct 05 20:55:43 crc kubenswrapper[4753]: I1005 20:55:43.795523 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" event={"ID":"f21504e5-2012-4b4a-a3fc-16e6dc364373","Type":"ContainerDied","Data":"27deda40dd04ac58b314e020ab9f7ca47ff71672c8311d934208899b64aa79b7"} Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.183678 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.318027 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-ceph\") pod \"f21504e5-2012-4b4a-a3fc-16e6dc364373\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.318099 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-inventory\") pod \"f21504e5-2012-4b4a-a3fc-16e6dc364373\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.318150 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-ssh-key\") pod \"f21504e5-2012-4b4a-a3fc-16e6dc364373\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.318271 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5226\" (UniqueName: \"kubernetes.io/projected/f21504e5-2012-4b4a-a3fc-16e6dc364373-kube-api-access-l5226\") pod \"f21504e5-2012-4b4a-a3fc-16e6dc364373\" (UID: \"f21504e5-2012-4b4a-a3fc-16e6dc364373\") " Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.325455 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f21504e5-2012-4b4a-a3fc-16e6dc364373-kube-api-access-l5226" (OuterVolumeSpecName: "kube-api-access-l5226") pod "f21504e5-2012-4b4a-a3fc-16e6dc364373" (UID: "f21504e5-2012-4b4a-a3fc-16e6dc364373"). InnerVolumeSpecName "kube-api-access-l5226". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.325998 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-ceph" (OuterVolumeSpecName: "ceph") pod "f21504e5-2012-4b4a-a3fc-16e6dc364373" (UID: "f21504e5-2012-4b4a-a3fc-16e6dc364373"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.355465 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-inventory" (OuterVolumeSpecName: "inventory") pod "f21504e5-2012-4b4a-a3fc-16e6dc364373" (UID: "f21504e5-2012-4b4a-a3fc-16e6dc364373"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.356036 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f21504e5-2012-4b4a-a3fc-16e6dc364373" (UID: "f21504e5-2012-4b4a-a3fc-16e6dc364373"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.420900 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5226\" (UniqueName: \"kubernetes.io/projected/f21504e5-2012-4b4a-a3fc-16e6dc364373-kube-api-access-l5226\") on node \"crc\" DevicePath \"\"" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.420931 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.420941 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.420951 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f21504e5-2012-4b4a-a3fc-16e6dc364373-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.812654 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.812656 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj" event={"ID":"f21504e5-2012-4b4a-a3fc-16e6dc364373","Type":"ContainerDied","Data":"59656413db51d3e1a1c910127d09c11791e1552797157d75b2985e1671625b7e"} Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.812699 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59656413db51d3e1a1c910127d09c11791e1552797157d75b2985e1671625b7e" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.906021 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-n4pww"] Oct 05 20:55:45 crc kubenswrapper[4753]: E1005 20:55:45.906441 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21504e5-2012-4b4a-a3fc-16e6dc364373" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.906463 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21504e5-2012-4b4a-a3fc-16e6dc364373" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.906691 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f21504e5-2012-4b4a-a3fc-16e6dc364373" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.907408 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.913334 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.913334 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.913915 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.914184 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.914465 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:55:45 crc kubenswrapper[4753]: I1005 20:55:45.921473 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-n4pww"] Oct 05 20:55:46 crc kubenswrapper[4753]: I1005 20:55:46.037360 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh8pb\" (UniqueName: \"kubernetes.io/projected/5d24d938-36bb-4d7b-94e6-f0332f50a71a-kube-api-access-nh8pb\") pod \"ssh-known-hosts-edpm-deployment-n4pww\" (UID: \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:55:46 crc kubenswrapper[4753]: I1005 20:55:46.037655 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-n4pww\" (UID: \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:55:46 crc kubenswrapper[4753]: I1005 20:55:46.037851 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-ceph\") pod \"ssh-known-hosts-edpm-deployment-n4pww\" (UID: \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:55:46 crc kubenswrapper[4753]: I1005 20:55:46.037947 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-n4pww\" (UID: \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:55:46 crc kubenswrapper[4753]: I1005 20:55:46.140392 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-ceph\") pod \"ssh-known-hosts-edpm-deployment-n4pww\" (UID: \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:55:46 crc kubenswrapper[4753]: I1005 20:55:46.140473 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-n4pww\" (UID: \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:55:46 crc kubenswrapper[4753]: I1005 20:55:46.140568 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh8pb\" (UniqueName: \"kubernetes.io/projected/5d24d938-36bb-4d7b-94e6-f0332f50a71a-kube-api-access-nh8pb\") pod \"ssh-known-hosts-edpm-deployment-n4pww\" (UID: 
\"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:55:46 crc kubenswrapper[4753]: I1005 20:55:46.140635 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-n4pww\" (UID: \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:55:46 crc kubenswrapper[4753]: I1005 20:55:46.144781 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-n4pww\" (UID: \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:55:46 crc kubenswrapper[4753]: I1005 20:55:46.145756 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-n4pww\" (UID: \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:55:46 crc kubenswrapper[4753]: I1005 20:55:46.153949 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-ceph\") pod \"ssh-known-hosts-edpm-deployment-n4pww\" (UID: \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:55:46 crc kubenswrapper[4753]: I1005 20:55:46.160887 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh8pb\" (UniqueName: \"kubernetes.io/projected/5d24d938-36bb-4d7b-94e6-f0332f50a71a-kube-api-access-nh8pb\") pod 
\"ssh-known-hosts-edpm-deployment-n4pww\" (UID: \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:55:46 crc kubenswrapper[4753]: I1005 20:55:46.224563 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:55:46 crc kubenswrapper[4753]: I1005 20:55:46.922534 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-n4pww"] Oct 05 20:55:47 crc kubenswrapper[4753]: I1005 20:55:47.836330 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" event={"ID":"5d24d938-36bb-4d7b-94e6-f0332f50a71a","Type":"ContainerStarted","Data":"df1b9bf9aa6174ef347d6e780b74d361afcbff2bba9a8fe4c39e1e4989bfacc3"} Oct 05 20:55:47 crc kubenswrapper[4753]: I1005 20:55:47.836700 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" event={"ID":"5d24d938-36bb-4d7b-94e6-f0332f50a71a","Type":"ContainerStarted","Data":"6078bf38176f48e2bbc28ce52030136466f239b71ba9953ede8ebf40069f4588"} Oct 05 20:55:47 crc kubenswrapper[4753]: I1005 20:55:47.852705 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" podStartSLOduration=2.259409399 podStartE2EDuration="2.852687617s" podCreationTimestamp="2025-10-05 20:55:45 +0000 UTC" firstStartedPulling="2025-10-05 20:55:46.934341031 +0000 UTC m=+2455.782669263" lastFinishedPulling="2025-10-05 20:55:47.527619249 +0000 UTC m=+2456.375947481" observedRunningTime="2025-10-05 20:55:47.848712093 +0000 UTC m=+2456.697040325" watchObservedRunningTime="2025-10-05 20:55:47.852687617 +0000 UTC m=+2456.701015849" Oct 05 20:55:48 crc kubenswrapper[4753]: I1005 20:55:48.852159 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:55:48 crc 
kubenswrapper[4753]: E1005 20:55:48.852766 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:55:57 crc kubenswrapper[4753]: I1005 20:55:57.937726 4753 generic.go:334] "Generic (PLEG): container finished" podID="5d24d938-36bb-4d7b-94e6-f0332f50a71a" containerID="df1b9bf9aa6174ef347d6e780b74d361afcbff2bba9a8fe4c39e1e4989bfacc3" exitCode=0 Oct 05 20:55:57 crc kubenswrapper[4753]: I1005 20:55:57.937820 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" event={"ID":"5d24d938-36bb-4d7b-94e6-f0332f50a71a","Type":"ContainerDied","Data":"df1b9bf9aa6174ef347d6e780b74d361afcbff2bba9a8fe4c39e1e4989bfacc3"} Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.350919 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.369271 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh8pb\" (UniqueName: \"kubernetes.io/projected/5d24d938-36bb-4d7b-94e6-f0332f50a71a-kube-api-access-nh8pb\") pod \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\" (UID: \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.371225 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-ssh-key-openstack-edpm-ipam\") pod \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\" (UID: \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.371340 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-inventory-0\") pod \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\" (UID: \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.371463 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-ceph\") pod \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\" (UID: \"5d24d938-36bb-4d7b-94e6-f0332f50a71a\") " Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.381938 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-ceph" (OuterVolumeSpecName: "ceph") pod "5d24d938-36bb-4d7b-94e6-f0332f50a71a" (UID: "5d24d938-36bb-4d7b-94e6-f0332f50a71a"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.390888 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d24d938-36bb-4d7b-94e6-f0332f50a71a-kube-api-access-nh8pb" (OuterVolumeSpecName: "kube-api-access-nh8pb") pod "5d24d938-36bb-4d7b-94e6-f0332f50a71a" (UID: "5d24d938-36bb-4d7b-94e6-f0332f50a71a"). InnerVolumeSpecName "kube-api-access-nh8pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.400856 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5d24d938-36bb-4d7b-94e6-f0332f50a71a" (UID: "5d24d938-36bb-4d7b-94e6-f0332f50a71a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.402269 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "5d24d938-36bb-4d7b-94e6-f0332f50a71a" (UID: "5d24d938-36bb-4d7b-94e6-f0332f50a71a"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.476002 4753 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.476038 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.476049 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh8pb\" (UniqueName: \"kubernetes.io/projected/5d24d938-36bb-4d7b-94e6-f0332f50a71a-kube-api-access-nh8pb\") on node \"crc\" DevicePath \"\"" Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.476064 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d24d938-36bb-4d7b-94e6-f0332f50a71a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.960714 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" event={"ID":"5d24d938-36bb-4d7b-94e6-f0332f50a71a","Type":"ContainerDied","Data":"6078bf38176f48e2bbc28ce52030136466f239b71ba9953ede8ebf40069f4588"} Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.960788 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6078bf38176f48e2bbc28ce52030136466f239b71ba9953ede8ebf40069f4588" Oct 05 20:55:59 crc kubenswrapper[4753]: I1005 20:55:59.960899 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-n4pww" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.062424 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg"] Oct 05 20:56:00 crc kubenswrapper[4753]: E1005 20:56:00.062954 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d24d938-36bb-4d7b-94e6-f0332f50a71a" containerName="ssh-known-hosts-edpm-deployment" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.062985 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d24d938-36bb-4d7b-94e6-f0332f50a71a" containerName="ssh-known-hosts-edpm-deployment" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.063259 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d24d938-36bb-4d7b-94e6-f0332f50a71a" containerName="ssh-known-hosts-edpm-deployment" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.063936 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.075735 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg"] Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.077014 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.077257 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.101868 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.102354 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.103224 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.105503 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vqmtg\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.105783 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vqmtg\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.105898 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9ttq\" (UniqueName: \"kubernetes.io/projected/1073a302-b108-4caa-aa77-78d64fd8f169-kube-api-access-b9ttq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vqmtg\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.106014 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vqmtg\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.208009 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vqmtg\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.208367 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9ttq\" (UniqueName: \"kubernetes.io/projected/1073a302-b108-4caa-aa77-78d64fd8f169-kube-api-access-b9ttq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vqmtg\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.208715 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vqmtg\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.209163 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vqmtg\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.214811 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vqmtg\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.216229 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vqmtg\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.217084 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vqmtg\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.227199 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-b9ttq\" (UniqueName: \"kubernetes.io/projected/1073a302-b108-4caa-aa77-78d64fd8f169-kube-api-access-b9ttq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vqmtg\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:00 crc kubenswrapper[4753]: I1005 20:56:00.427473 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:01 crc kubenswrapper[4753]: I1005 20:56:01.000526 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg"] Oct 05 20:56:01 crc kubenswrapper[4753]: W1005 20:56:01.011542 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1073a302_b108_4caa_aa77_78d64fd8f169.slice/crio-beba773b090a5ec2e27faaec4f308b69edf980998b83ce5a66aa3df679a1d227 WatchSource:0}: Error finding container beba773b090a5ec2e27faaec4f308b69edf980998b83ce5a66aa3df679a1d227: Status 404 returned error can't find the container with id beba773b090a5ec2e27faaec4f308b69edf980998b83ce5a66aa3df679a1d227 Oct 05 20:56:01 crc kubenswrapper[4753]: I1005 20:56:01.976117 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" event={"ID":"1073a302-b108-4caa-aa77-78d64fd8f169","Type":"ContainerStarted","Data":"1370c2fddc760b4b3a727392838f00719eaba69e6f2622d036a594edde500a86"} Oct 05 20:56:01 crc kubenswrapper[4753]: I1005 20:56:01.976555 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" event={"ID":"1073a302-b108-4caa-aa77-78d64fd8f169","Type":"ContainerStarted","Data":"beba773b090a5ec2e27faaec4f308b69edf980998b83ce5a66aa3df679a1d227"} Oct 05 20:56:02 crc kubenswrapper[4753]: I1005 20:56:02.852541 4753 scope.go:117] "RemoveContainer" 
containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:56:02 crc kubenswrapper[4753]: E1005 20:56:02.852864 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:56:10 crc kubenswrapper[4753]: I1005 20:56:10.038220 4753 generic.go:334] "Generic (PLEG): container finished" podID="1073a302-b108-4caa-aa77-78d64fd8f169" containerID="1370c2fddc760b4b3a727392838f00719eaba69e6f2622d036a594edde500a86" exitCode=0 Oct 05 20:56:10 crc kubenswrapper[4753]: I1005 20:56:10.038329 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" event={"ID":"1073a302-b108-4caa-aa77-78d64fd8f169","Type":"ContainerDied","Data":"1370c2fddc760b4b3a727392838f00719eaba69e6f2622d036a594edde500a86"} Oct 05 20:56:11 crc kubenswrapper[4753]: I1005 20:56:11.632181 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:11 crc kubenswrapper[4753]: I1005 20:56:11.719086 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9ttq\" (UniqueName: \"kubernetes.io/projected/1073a302-b108-4caa-aa77-78d64fd8f169-kube-api-access-b9ttq\") pod \"1073a302-b108-4caa-aa77-78d64fd8f169\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " Oct 05 20:56:11 crc kubenswrapper[4753]: I1005 20:56:11.719391 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-ssh-key\") pod \"1073a302-b108-4caa-aa77-78d64fd8f169\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " Oct 05 20:56:11 crc kubenswrapper[4753]: I1005 20:56:11.719489 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-inventory\") pod \"1073a302-b108-4caa-aa77-78d64fd8f169\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " Oct 05 20:56:11 crc kubenswrapper[4753]: I1005 20:56:11.719654 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-ceph\") pod \"1073a302-b108-4caa-aa77-78d64fd8f169\" (UID: \"1073a302-b108-4caa-aa77-78d64fd8f169\") " Oct 05 20:56:11 crc kubenswrapper[4753]: I1005 20:56:11.727300 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1073a302-b108-4caa-aa77-78d64fd8f169-kube-api-access-b9ttq" (OuterVolumeSpecName: "kube-api-access-b9ttq") pod "1073a302-b108-4caa-aa77-78d64fd8f169" (UID: "1073a302-b108-4caa-aa77-78d64fd8f169"). InnerVolumeSpecName "kube-api-access-b9ttq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:56:11 crc kubenswrapper[4753]: I1005 20:56:11.732280 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-ceph" (OuterVolumeSpecName: "ceph") pod "1073a302-b108-4caa-aa77-78d64fd8f169" (UID: "1073a302-b108-4caa-aa77-78d64fd8f169"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:56:11 crc kubenswrapper[4753]: I1005 20:56:11.766005 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-inventory" (OuterVolumeSpecName: "inventory") pod "1073a302-b108-4caa-aa77-78d64fd8f169" (UID: "1073a302-b108-4caa-aa77-78d64fd8f169"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:56:11 crc kubenswrapper[4753]: I1005 20:56:11.766519 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1073a302-b108-4caa-aa77-78d64fd8f169" (UID: "1073a302-b108-4caa-aa77-78d64fd8f169"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:56:11 crc kubenswrapper[4753]: I1005 20:56:11.821986 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 20:56:11 crc kubenswrapper[4753]: I1005 20:56:11.822024 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9ttq\" (UniqueName: \"kubernetes.io/projected/1073a302-b108-4caa-aa77-78d64fd8f169-kube-api-access-b9ttq\") on node \"crc\" DevicePath \"\"" Oct 05 20:56:11 crc kubenswrapper[4753]: I1005 20:56:11.822039 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:56:11 crc kubenswrapper[4753]: I1005 20:56:11.822052 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1073a302-b108-4caa-aa77-78d64fd8f169-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.058297 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" event={"ID":"1073a302-b108-4caa-aa77-78d64fd8f169","Type":"ContainerDied","Data":"beba773b090a5ec2e27faaec4f308b69edf980998b83ce5a66aa3df679a1d227"} Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.058350 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beba773b090a5ec2e27faaec4f308b69edf980998b83ce5a66aa3df679a1d227" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.058611 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vqmtg" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.146821 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s"] Oct 05 20:56:12 crc kubenswrapper[4753]: E1005 20:56:12.147581 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1073a302-b108-4caa-aa77-78d64fd8f169" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.147680 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1073a302-b108-4caa-aa77-78d64fd8f169" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.147956 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1073a302-b108-4caa-aa77-78d64fd8f169" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.148924 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.150909 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.151408 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.151725 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.152491 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.152790 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.163754 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s"] Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.229391 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.229486 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjrqr\" (UniqueName: \"kubernetes.io/projected/dd50c8ec-c247-4691-9c6d-6c72c1e89227-kube-api-access-wjrqr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.229535 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.229563 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.331318 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.331402 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjrqr\" (UniqueName: \"kubernetes.io/projected/dd50c8ec-c247-4691-9c6d-6c72c1e89227-kube-api-access-wjrqr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.331449 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.331483 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.334980 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.335647 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.336841 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.348216 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wjrqr\" (UniqueName: \"kubernetes.io/projected/dd50c8ec-c247-4691-9c6d-6c72c1e89227-kube-api-access-wjrqr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:12 crc kubenswrapper[4753]: I1005 20:56:12.467364 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:13 crc kubenswrapper[4753]: I1005 20:56:13.019010 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s"] Oct 05 20:56:13 crc kubenswrapper[4753]: I1005 20:56:13.067022 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" event={"ID":"dd50c8ec-c247-4691-9c6d-6c72c1e89227","Type":"ContainerStarted","Data":"71dff2e0c718e1891fe7e244fd8928431346d7e496358c1a20da3ae92df4dcc9"} Oct 05 20:56:14 crc kubenswrapper[4753]: I1005 20:56:14.076820 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" event={"ID":"dd50c8ec-c247-4691-9c6d-6c72c1e89227","Type":"ContainerStarted","Data":"7063b7a893bbf1ad4f8701438f5cd22df1adaa77db977baaf9b8e17a2840b85d"} Oct 05 20:56:14 crc kubenswrapper[4753]: I1005 20:56:14.105802 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" podStartSLOduration=1.680035435 podStartE2EDuration="2.105779128s" podCreationTimestamp="2025-10-05 20:56:12 +0000 UTC" firstStartedPulling="2025-10-05 20:56:13.026017258 +0000 UTC m=+2481.874345530" lastFinishedPulling="2025-10-05 20:56:13.451760991 +0000 UTC m=+2482.300089223" observedRunningTime="2025-10-05 20:56:14.102565709 +0000 UTC m=+2482.950893951" 
watchObservedRunningTime="2025-10-05 20:56:14.105779128 +0000 UTC m=+2482.954107370" Oct 05 20:56:17 crc kubenswrapper[4753]: I1005 20:56:17.852544 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:56:17 crc kubenswrapper[4753]: E1005 20:56:17.853701 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:56:24 crc kubenswrapper[4753]: I1005 20:56:24.157069 4753 generic.go:334] "Generic (PLEG): container finished" podID="dd50c8ec-c247-4691-9c6d-6c72c1e89227" containerID="7063b7a893bbf1ad4f8701438f5cd22df1adaa77db977baaf9b8e17a2840b85d" exitCode=0 Oct 05 20:56:24 crc kubenswrapper[4753]: I1005 20:56:24.157075 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" event={"ID":"dd50c8ec-c247-4691-9c6d-6c72c1e89227","Type":"ContainerDied","Data":"7063b7a893bbf1ad4f8701438f5cd22df1adaa77db977baaf9b8e17a2840b85d"} Oct 05 20:56:25 crc kubenswrapper[4753]: I1005 20:56:25.567438 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:25 crc kubenswrapper[4753]: I1005 20:56:25.666013 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjrqr\" (UniqueName: \"kubernetes.io/projected/dd50c8ec-c247-4691-9c6d-6c72c1e89227-kube-api-access-wjrqr\") pod \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " Oct 05 20:56:25 crc kubenswrapper[4753]: I1005 20:56:25.666496 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-inventory\") pod \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " Oct 05 20:56:25 crc kubenswrapper[4753]: I1005 20:56:25.666546 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-ceph\") pod \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " Oct 05 20:56:25 crc kubenswrapper[4753]: I1005 20:56:25.666569 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-ssh-key\") pod \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\" (UID: \"dd50c8ec-c247-4691-9c6d-6c72c1e89227\") " Oct 05 20:56:25 crc kubenswrapper[4753]: I1005 20:56:25.675512 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd50c8ec-c247-4691-9c6d-6c72c1e89227-kube-api-access-wjrqr" (OuterVolumeSpecName: "kube-api-access-wjrqr") pod "dd50c8ec-c247-4691-9c6d-6c72c1e89227" (UID: "dd50c8ec-c247-4691-9c6d-6c72c1e89227"). InnerVolumeSpecName "kube-api-access-wjrqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:56:25 crc kubenswrapper[4753]: I1005 20:56:25.679206 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-ceph" (OuterVolumeSpecName: "ceph") pod "dd50c8ec-c247-4691-9c6d-6c72c1e89227" (UID: "dd50c8ec-c247-4691-9c6d-6c72c1e89227"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:56:25 crc kubenswrapper[4753]: I1005 20:56:25.694186 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-inventory" (OuterVolumeSpecName: "inventory") pod "dd50c8ec-c247-4691-9c6d-6c72c1e89227" (UID: "dd50c8ec-c247-4691-9c6d-6c72c1e89227"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:56:25 crc kubenswrapper[4753]: I1005 20:56:25.714248 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dd50c8ec-c247-4691-9c6d-6c72c1e89227" (UID: "dd50c8ec-c247-4691-9c6d-6c72c1e89227"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:56:25 crc kubenswrapper[4753]: I1005 20:56:25.769264 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjrqr\" (UniqueName: \"kubernetes.io/projected/dd50c8ec-c247-4691-9c6d-6c72c1e89227-kube-api-access-wjrqr\") on node \"crc\" DevicePath \"\"" Oct 05 20:56:25 crc kubenswrapper[4753]: I1005 20:56:25.769327 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:56:25 crc kubenswrapper[4753]: I1005 20:56:25.769339 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 20:56:25 crc kubenswrapper[4753]: I1005 20:56:25.769350 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd50c8ec-c247-4691-9c6d-6c72c1e89227-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.182391 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" event={"ID":"dd50c8ec-c247-4691-9c6d-6c72c1e89227","Type":"ContainerDied","Data":"71dff2e0c718e1891fe7e244fd8928431346d7e496358c1a20da3ae92df4dcc9"} Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.182437 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71dff2e0c718e1891fe7e244fd8928431346d7e496358c1a20da3ae92df4dcc9" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.182512 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.285291 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd"] Oct 05 20:56:26 crc kubenswrapper[4753]: E1005 20:56:26.285816 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd50c8ec-c247-4691-9c6d-6c72c1e89227" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.285847 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd50c8ec-c247-4691-9c6d-6c72c1e89227" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.286222 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd50c8ec-c247-4691-9c6d-6c72c1e89227" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.287253 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.290886 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.290977 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.290977 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.291117 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.291229 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.291630 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.292960 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.296459 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.300716 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd"] Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.381126 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.381191 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.381223 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.381253 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.381274 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.381302 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.381408 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.381444 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.381470 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.381492 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.381602 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.381683 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n5ck\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-kube-api-access-7n5ck\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.381863 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: 
\"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.483053 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.483120 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.483195 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.483233 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: 
I1005 20:56:26.483259 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.483283 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.483376 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.483407 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.483437 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.483460 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.483488 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.483521 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n5ck\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-kube-api-access-7n5ck\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.483562 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.487994 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.488471 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.490304 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.490412 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.492975 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.493736 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.494257 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.495545 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: 
I1005 20:56:26.495629 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.498129 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.502337 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.510611 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.511499 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n5ck\" (UniqueName: 
\"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-kube-api-access-7n5ck\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:26 crc kubenswrapper[4753]: I1005 20:56:26.607588 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:56:27 crc kubenswrapper[4753]: I1005 20:56:27.161858 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd"] Oct 05 20:56:27 crc kubenswrapper[4753]: I1005 20:56:27.195477 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" event={"ID":"13f3a6ea-8a17-4bf8-a252-f53e5856466a","Type":"ContainerStarted","Data":"1863e2351f403a216c209b6efd3413a5792267176264c2dc4b6baf963718c976"} Oct 05 20:56:28 crc kubenswrapper[4753]: I1005 20:56:28.204557 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" event={"ID":"13f3a6ea-8a17-4bf8-a252-f53e5856466a","Type":"ContainerStarted","Data":"d5df69f146904fe680c98059c9b4f8b862faa74e6103ebe8c27a9c156cdd06f5"} Oct 05 20:56:28 crc kubenswrapper[4753]: I1005 20:56:28.237597 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" podStartSLOduration=1.807256558 podStartE2EDuration="2.237579103s" podCreationTimestamp="2025-10-05 20:56:26 +0000 UTC" firstStartedPulling="2025-10-05 20:56:27.180784433 +0000 UTC m=+2496.029112665" lastFinishedPulling="2025-10-05 20:56:27.611106978 +0000 UTC m=+2496.459435210" observedRunningTime="2025-10-05 20:56:28.231290398 +0000 UTC m=+2497.079618650" watchObservedRunningTime="2025-10-05 20:56:28.237579103 +0000 UTC 
m=+2497.085907345" Oct 05 20:56:28 crc kubenswrapper[4753]: I1005 20:56:28.851965 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:56:28 crc kubenswrapper[4753]: E1005 20:56:28.852261 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:56:40 crc kubenswrapper[4753]: I1005 20:56:40.852534 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:56:40 crc kubenswrapper[4753]: E1005 20:56:40.853298 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:56:54 crc kubenswrapper[4753]: I1005 20:56:54.852164 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:56:54 crc kubenswrapper[4753]: E1005 20:56:54.852944 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" 
podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:57:05 crc kubenswrapper[4753]: I1005 20:57:05.525838 4753 generic.go:334] "Generic (PLEG): container finished" podID="13f3a6ea-8a17-4bf8-a252-f53e5856466a" containerID="d5df69f146904fe680c98059c9b4f8b862faa74e6103ebe8c27a9c156cdd06f5" exitCode=0 Oct 05 20:57:05 crc kubenswrapper[4753]: I1005 20:57:05.526439 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" event={"ID":"13f3a6ea-8a17-4bf8-a252-f53e5856466a","Type":"ContainerDied","Data":"d5df69f146904fe680c98059c9b4f8b862faa74e6103ebe8c27a9c156cdd06f5"} Oct 05 20:57:06 crc kubenswrapper[4753]: I1005 20:57:06.939321 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.085018 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-nova-combined-ca-bundle\") pod \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.085076 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n5ck\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-kube-api-access-7n5ck\") pod \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.085161 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-repo-setup-combined-ca-bundle\") pod \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " Oct 05 20:57:07 crc 
kubenswrapper[4753]: I1005 20:57:07.085189 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.085222 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.085273 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-bootstrap-combined-ca-bundle\") pod \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.085303 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.085357 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-inventory\") pod \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 
20:57:07.085390 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ovn-combined-ca-bundle\") pod \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.085424 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-neutron-metadata-combined-ca-bundle\") pod \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.085464 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ssh-key\") pod \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.085513 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-libvirt-combined-ca-bundle\") pod \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.085542 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ceph\") pod \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\" (UID: \"13f3a6ea-8a17-4bf8-a252-f53e5856466a\") " Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.091527 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-kube-api-access-7n5ck" 
(OuterVolumeSpecName: "kube-api-access-7n5ck") pod "13f3a6ea-8a17-4bf8-a252-f53e5856466a" (UID: "13f3a6ea-8a17-4bf8-a252-f53e5856466a"). InnerVolumeSpecName "kube-api-access-7n5ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.091578 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "13f3a6ea-8a17-4bf8-a252-f53e5856466a" (UID: "13f3a6ea-8a17-4bf8-a252-f53e5856466a"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.091733 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "13f3a6ea-8a17-4bf8-a252-f53e5856466a" (UID: "13f3a6ea-8a17-4bf8-a252-f53e5856466a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.093395 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "13f3a6ea-8a17-4bf8-a252-f53e5856466a" (UID: "13f3a6ea-8a17-4bf8-a252-f53e5856466a"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.093540 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "13f3a6ea-8a17-4bf8-a252-f53e5856466a" (UID: "13f3a6ea-8a17-4bf8-a252-f53e5856466a"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.094817 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "13f3a6ea-8a17-4bf8-a252-f53e5856466a" (UID: "13f3a6ea-8a17-4bf8-a252-f53e5856466a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.095104 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "13f3a6ea-8a17-4bf8-a252-f53e5856466a" (UID: "13f3a6ea-8a17-4bf8-a252-f53e5856466a"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.096883 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "13f3a6ea-8a17-4bf8-a252-f53e5856466a" (UID: "13f3a6ea-8a17-4bf8-a252-f53e5856466a"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.097428 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "13f3a6ea-8a17-4bf8-a252-f53e5856466a" (UID: "13f3a6ea-8a17-4bf8-a252-f53e5856466a"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.097864 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ceph" (OuterVolumeSpecName: "ceph") pod "13f3a6ea-8a17-4bf8-a252-f53e5856466a" (UID: "13f3a6ea-8a17-4bf8-a252-f53e5856466a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.107436 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "13f3a6ea-8a17-4bf8-a252-f53e5856466a" (UID: "13f3a6ea-8a17-4bf8-a252-f53e5856466a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.115010 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "13f3a6ea-8a17-4bf8-a252-f53e5856466a" (UID: "13f3a6ea-8a17-4bf8-a252-f53e5856466a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.117152 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-inventory" (OuterVolumeSpecName: "inventory") pod "13f3a6ea-8a17-4bf8-a252-f53e5856466a" (UID: "13f3a6ea-8a17-4bf8-a252-f53e5856466a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.199115 4753 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.199162 4753 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.199182 4753 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.199201 4753 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.199216 4753 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.199230 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.199241 4753 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.199254 4753 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.199265 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.199277 4753 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.199288 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.199300 4753 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13f3a6ea-8a17-4bf8-a252-f53e5856466a-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.199311 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n5ck\" (UniqueName: \"kubernetes.io/projected/13f3a6ea-8a17-4bf8-a252-f53e5856466a-kube-api-access-7n5ck\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.542946 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" event={"ID":"13f3a6ea-8a17-4bf8-a252-f53e5856466a","Type":"ContainerDied","Data":"1863e2351f403a216c209b6efd3413a5792267176264c2dc4b6baf963718c976"} Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.542980 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1863e2351f403a216c209b6efd3413a5792267176264c2dc4b6baf963718c976" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.543016 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.645497 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj"] Oct 05 20:57:07 crc kubenswrapper[4753]: E1005 20:57:07.645860 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f3a6ea-8a17-4bf8-a252-f53e5856466a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.645876 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f3a6ea-8a17-4bf8-a252-f53e5856466a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.646033 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="13f3a6ea-8a17-4bf8-a252-f53e5856466a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.646599 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.649649 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.649761 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.649826 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.649967 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.650074 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.670984 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj"] Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.808774 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj\" (UID: \"a3425a91-733d-43c0-b7af-42914da99374\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.808810 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj\" (UID: \"a3425a91-733d-43c0-b7af-42914da99374\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.808841 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj\" (UID: \"a3425a91-733d-43c0-b7af-42914da99374\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.808969 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfkcp\" (UniqueName: \"kubernetes.io/projected/a3425a91-733d-43c0-b7af-42914da99374-kube-api-access-zfkcp\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj\" (UID: \"a3425a91-733d-43c0-b7af-42914da99374\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.851708 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:57:07 crc kubenswrapper[4753]: E1005 20:57:07.851966 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.910270 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfkcp\" (UniqueName: \"kubernetes.io/projected/a3425a91-733d-43c0-b7af-42914da99374-kube-api-access-zfkcp\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj\" (UID: 
\"a3425a91-733d-43c0-b7af-42914da99374\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.910362 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj\" (UID: \"a3425a91-733d-43c0-b7af-42914da99374\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.910396 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj\" (UID: \"a3425a91-733d-43c0-b7af-42914da99374\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.910433 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj\" (UID: \"a3425a91-733d-43c0-b7af-42914da99374\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.915935 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj\" (UID: \"a3425a91-733d-43c0-b7af-42914da99374\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.916038 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj\" (UID: \"a3425a91-733d-43c0-b7af-42914da99374\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.918722 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj\" (UID: \"a3425a91-733d-43c0-b7af-42914da99374\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.940090 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfkcp\" (UniqueName: \"kubernetes.io/projected/a3425a91-733d-43c0-b7af-42914da99374-kube-api-access-zfkcp\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj\" (UID: \"a3425a91-733d-43c0-b7af-42914da99374\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:07 crc kubenswrapper[4753]: I1005 20:57:07.976056 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:08 crc kubenswrapper[4753]: I1005 20:57:08.558046 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj"] Oct 05 20:57:09 crc kubenswrapper[4753]: I1005 20:57:09.564033 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" event={"ID":"a3425a91-733d-43c0-b7af-42914da99374","Type":"ContainerStarted","Data":"4c50c5bb2325a613304d9be4e7813744e7e191860f5d7562f4dc7070679a433b"} Oct 05 20:57:09 crc kubenswrapper[4753]: I1005 20:57:09.564342 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" event={"ID":"a3425a91-733d-43c0-b7af-42914da99374","Type":"ContainerStarted","Data":"955a23ba51168b164ab5ca23127c033caf1ec744aaeeca0d6c097c7e6a4255a9"} Oct 05 20:57:15 crc kubenswrapper[4753]: I1005 20:57:15.617430 4753 generic.go:334] "Generic (PLEG): container finished" podID="a3425a91-733d-43c0-b7af-42914da99374" containerID="4c50c5bb2325a613304d9be4e7813744e7e191860f5d7562f4dc7070679a433b" exitCode=0 Oct 05 20:57:15 crc kubenswrapper[4753]: I1005 20:57:15.617542 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" event={"ID":"a3425a91-733d-43c0-b7af-42914da99374","Type":"ContainerDied","Data":"4c50c5bb2325a613304d9be4e7813744e7e191860f5d7562f4dc7070679a433b"} Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.000313 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.195647 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfkcp\" (UniqueName: \"kubernetes.io/projected/a3425a91-733d-43c0-b7af-42914da99374-kube-api-access-zfkcp\") pod \"a3425a91-733d-43c0-b7af-42914da99374\" (UID: \"a3425a91-733d-43c0-b7af-42914da99374\") " Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.195760 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-inventory\") pod \"a3425a91-733d-43c0-b7af-42914da99374\" (UID: \"a3425a91-733d-43c0-b7af-42914da99374\") " Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.195879 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-ssh-key\") pod \"a3425a91-733d-43c0-b7af-42914da99374\" (UID: \"a3425a91-733d-43c0-b7af-42914da99374\") " Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.195907 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-ceph\") pod \"a3425a91-733d-43c0-b7af-42914da99374\" (UID: \"a3425a91-733d-43c0-b7af-42914da99374\") " Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.200976 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3425a91-733d-43c0-b7af-42914da99374-kube-api-access-zfkcp" (OuterVolumeSpecName: "kube-api-access-zfkcp") pod "a3425a91-733d-43c0-b7af-42914da99374" (UID: "a3425a91-733d-43c0-b7af-42914da99374"). InnerVolumeSpecName "kube-api-access-zfkcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.202189 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-ceph" (OuterVolumeSpecName: "ceph") pod "a3425a91-733d-43c0-b7af-42914da99374" (UID: "a3425a91-733d-43c0-b7af-42914da99374"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.222105 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-inventory" (OuterVolumeSpecName: "inventory") pod "a3425a91-733d-43c0-b7af-42914da99374" (UID: "a3425a91-733d-43c0-b7af-42914da99374"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.224462 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a3425a91-733d-43c0-b7af-42914da99374" (UID: "a3425a91-733d-43c0-b7af-42914da99374"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.297628 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.297652 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.297661 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfkcp\" (UniqueName: \"kubernetes.io/projected/a3425a91-733d-43c0-b7af-42914da99374-kube-api-access-zfkcp\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.297672 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3425a91-733d-43c0-b7af-42914da99374-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.630575 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" event={"ID":"a3425a91-733d-43c0-b7af-42914da99374","Type":"ContainerDied","Data":"955a23ba51168b164ab5ca23127c033caf1ec744aaeeca0d6c097c7e6a4255a9"} Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.630616 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="955a23ba51168b164ab5ca23127c033caf1ec744aaeeca0d6c097c7e6a4255a9" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.630671 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.793162 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49"] Oct 05 20:57:17 crc kubenswrapper[4753]: E1005 20:57:17.793522 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3425a91-733d-43c0-b7af-42914da99374" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.793540 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3425a91-733d-43c0-b7af-42914da99374" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.793717 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3425a91-733d-43c0-b7af-42914da99374" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.794327 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.797080 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.797321 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.797914 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.798111 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.798487 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.803356 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.806544 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/103811f6-8ae0-475f-878b-0c5c615265ee-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.806622 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzkfg\" (UniqueName: \"kubernetes.io/projected/103811f6-8ae0-475f-878b-0c5c615265ee-kube-api-access-gzkfg\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.806657 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.806707 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.806761 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.806829 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.817356 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49"] Oct 05 20:57:17 crc 
kubenswrapper[4753]: I1005 20:57:17.908456 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/103811f6-8ae0-475f-878b-0c5c615265ee-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.908505 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzkfg\" (UniqueName: \"kubernetes.io/projected/103811f6-8ae0-475f-878b-0c5c615265ee-kube-api-access-gzkfg\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.908536 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.908561 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.908588 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: 
\"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.908633 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.920114 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/103811f6-8ae0-475f-878b-0c5c615265ee-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.922447 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.930221 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.930255 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ssh-key\") 
pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.932156 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:17 crc kubenswrapper[4753]: I1005 20:57:17.937948 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzkfg\" (UniqueName: \"kubernetes.io/projected/103811f6-8ae0-475f-878b-0c5c615265ee-kube-api-access-gzkfg\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btv49\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:18 crc kubenswrapper[4753]: I1005 20:57:18.113778 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:57:18 crc kubenswrapper[4753]: I1005 20:57:18.665485 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49"] Oct 05 20:57:19 crc kubenswrapper[4753]: I1005 20:57:19.648308 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" event={"ID":"103811f6-8ae0-475f-878b-0c5c615265ee","Type":"ContainerStarted","Data":"4961f86b17d1b89246a309d769f850c8403b7b2414349e7801d26660cc90f55d"} Oct 05 20:57:20 crc kubenswrapper[4753]: I1005 20:57:20.657464 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" event={"ID":"103811f6-8ae0-475f-878b-0c5c615265ee","Type":"ContainerStarted","Data":"0eb6330ec8b11fcb1361fc9420061c4fe30f4f20e613362e75621009292c751e"} Oct 05 20:57:20 crc kubenswrapper[4753]: I1005 20:57:20.685759 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" podStartSLOduration=2.112945742 podStartE2EDuration="3.685737698s" podCreationTimestamp="2025-10-05 20:57:17 +0000 UTC" firstStartedPulling="2025-10-05 20:57:18.684474494 +0000 UTC m=+2547.532802746" lastFinishedPulling="2025-10-05 20:57:20.25726645 +0000 UTC m=+2549.105594702" observedRunningTime="2025-10-05 20:57:20.684073736 +0000 UTC m=+2549.532401968" watchObservedRunningTime="2025-10-05 20:57:20.685737698 +0000 UTC m=+2549.534065930" Oct 05 20:57:22 crc kubenswrapper[4753]: I1005 20:57:22.855675 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:57:22 crc kubenswrapper[4753]: E1005 20:57:22.856271 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 20:57:34 crc kubenswrapper[4753]: I1005 20:57:34.853460 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 20:57:35 crc kubenswrapper[4753]: I1005 20:57:35.817806 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"f370bebddf0b48b3eae16a7acc2526579ae4e4e93d5827de3123477139f20060"} Oct 05 20:58:46 crc kubenswrapper[4753]: I1005 20:58:46.412289 4753 generic.go:334] "Generic (PLEG): container finished" podID="103811f6-8ae0-475f-878b-0c5c615265ee" containerID="0eb6330ec8b11fcb1361fc9420061c4fe30f4f20e613362e75621009292c751e" exitCode=0 Oct 05 20:58:46 crc kubenswrapper[4753]: I1005 20:58:46.412375 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" event={"ID":"103811f6-8ae0-475f-878b-0c5c615265ee","Type":"ContainerDied","Data":"0eb6330ec8b11fcb1361fc9420061c4fe30f4f20e613362e75621009292c751e"} Oct 05 20:58:47 crc kubenswrapper[4753]: I1005 20:58:47.847958 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:58:47 crc kubenswrapper[4753]: I1005 20:58:47.992924 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/103811f6-8ae0-475f-878b-0c5c615265ee-ovncontroller-config-0\") pod \"103811f6-8ae0-475f-878b-0c5c615265ee\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " Oct 05 20:58:47 crc kubenswrapper[4753]: I1005 20:58:47.993034 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ovn-combined-ca-bundle\") pod \"103811f6-8ae0-475f-878b-0c5c615265ee\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " Oct 05 20:58:47 crc kubenswrapper[4753]: I1005 20:58:47.993079 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ssh-key\") pod \"103811f6-8ae0-475f-878b-0c5c615265ee\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " Oct 05 20:58:47 crc kubenswrapper[4753]: I1005 20:58:47.993111 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-inventory\") pod \"103811f6-8ae0-475f-878b-0c5c615265ee\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " Oct 05 20:58:47 crc kubenswrapper[4753]: I1005 20:58:47.993189 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ceph\") pod \"103811f6-8ae0-475f-878b-0c5c615265ee\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " Oct 05 20:58:47 crc kubenswrapper[4753]: I1005 20:58:47.993284 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-gzkfg\" (UniqueName: \"kubernetes.io/projected/103811f6-8ae0-475f-878b-0c5c615265ee-kube-api-access-gzkfg\") pod \"103811f6-8ae0-475f-878b-0c5c615265ee\" (UID: \"103811f6-8ae0-475f-878b-0c5c615265ee\") " Oct 05 20:58:47 crc kubenswrapper[4753]: I1005 20:58:47.999671 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "103811f6-8ae0-475f-878b-0c5c615265ee" (UID: "103811f6-8ae0-475f-878b-0c5c615265ee"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.000184 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/103811f6-8ae0-475f-878b-0c5c615265ee-kube-api-access-gzkfg" (OuterVolumeSpecName: "kube-api-access-gzkfg") pod "103811f6-8ae0-475f-878b-0c5c615265ee" (UID: "103811f6-8ae0-475f-878b-0c5c615265ee"). InnerVolumeSpecName "kube-api-access-gzkfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.000267 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ceph" (OuterVolumeSpecName: "ceph") pod "103811f6-8ae0-475f-878b-0c5c615265ee" (UID: "103811f6-8ae0-475f-878b-0c5c615265ee"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.024187 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "103811f6-8ae0-475f-878b-0c5c615265ee" (UID: "103811f6-8ae0-475f-878b-0c5c615265ee"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.032079 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/103811f6-8ae0-475f-878b-0c5c615265ee-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "103811f6-8ae0-475f-878b-0c5c615265ee" (UID: "103811f6-8ae0-475f-878b-0c5c615265ee"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.048549 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-inventory" (OuterVolumeSpecName: "inventory") pod "103811f6-8ae0-475f-878b-0c5c615265ee" (UID: "103811f6-8ae0-475f-878b-0c5c615265ee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.095092 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.095125 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzkfg\" (UniqueName: \"kubernetes.io/projected/103811f6-8ae0-475f-878b-0c5c615265ee-kube-api-access-gzkfg\") on node \"crc\" DevicePath \"\"" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.095158 4753 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/103811f6-8ae0-475f-878b-0c5c615265ee-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.095167 4753 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ovn-combined-ca-bundle\") 
on node \"crc\" DevicePath \"\"" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.095175 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.095183 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/103811f6-8ae0-475f-878b-0c5c615265ee-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.438087 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" event={"ID":"103811f6-8ae0-475f-878b-0c5c615265ee","Type":"ContainerDied","Data":"4961f86b17d1b89246a309d769f850c8403b7b2414349e7801d26660cc90f55d"} Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.438513 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4961f86b17d1b89246a309d769f850c8403b7b2414349e7801d26660cc90f55d" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.438344 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btv49" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.596821 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q"] Oct 05 20:58:48 crc kubenswrapper[4753]: E1005 20:58:48.597551 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103811f6-8ae0-475f-878b-0c5c615265ee" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.597664 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="103811f6-8ae0-475f-878b-0c5c615265ee" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.598036 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="103811f6-8ae0-475f-878b-0c5c615265ee" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.598932 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.601604 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.603198 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.603602 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.603870 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.604130 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.604397 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.604712 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.619292 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q"] Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.707472 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.707552 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.707585 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.707614 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.707655 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9lbn\" (UniqueName: \"kubernetes.io/projected/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-kube-api-access-c9lbn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc 
kubenswrapper[4753]: I1005 20:58:48.707796 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.707828 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.809861 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.809916 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.809959 4753 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.809984 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.810014 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.810054 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9lbn\" (UniqueName: \"kubernetes.io/projected/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-kube-api-access-c9lbn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.810205 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.814468 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.815274 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.816710 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.817276 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 
crc kubenswrapper[4753]: I1005 20:58:48.817829 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.819885 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.827086 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9lbn\" (UniqueName: \"kubernetes.io/projected/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-kube-api-access-c9lbn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.918069 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6fzth"] Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.919931 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.920755 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:58:48 crc kubenswrapper[4753]: I1005 20:58:48.986432 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fzth"] Oct 05 20:58:49 crc kubenswrapper[4753]: I1005 20:58:49.115430 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qw2j\" (UniqueName: \"kubernetes.io/projected/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-kube-api-access-5qw2j\") pod \"redhat-marketplace-6fzth\" (UID: \"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd\") " pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:58:49 crc kubenswrapper[4753]: I1005 20:58:49.115827 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-utilities\") pod \"redhat-marketplace-6fzth\" (UID: \"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd\") " pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:58:49 crc kubenswrapper[4753]: I1005 20:58:49.115864 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-catalog-content\") pod \"redhat-marketplace-6fzth\" (UID: \"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd\") " pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:58:49 crc kubenswrapper[4753]: I1005 20:58:49.217448 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qw2j\" (UniqueName: \"kubernetes.io/projected/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-kube-api-access-5qw2j\") pod \"redhat-marketplace-6fzth\" (UID: \"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd\") " pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:58:49 crc kubenswrapper[4753]: I1005 20:58:49.217515 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-utilities\") pod \"redhat-marketplace-6fzth\" (UID: \"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd\") " pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:58:49 crc kubenswrapper[4753]: I1005 20:58:49.217541 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-catalog-content\") pod \"redhat-marketplace-6fzth\" (UID: \"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd\") " pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:58:49 crc kubenswrapper[4753]: I1005 20:58:49.218015 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-catalog-content\") pod \"redhat-marketplace-6fzth\" (UID: \"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd\") " pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:58:49 crc kubenswrapper[4753]: I1005 20:58:49.218892 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-utilities\") pod \"redhat-marketplace-6fzth\" (UID: \"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd\") " pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:58:49 crc kubenswrapper[4753]: I1005 20:58:49.239841 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qw2j\" (UniqueName: \"kubernetes.io/projected/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-kube-api-access-5qw2j\") pod \"redhat-marketplace-6fzth\" (UID: \"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd\") " pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:58:49 crc kubenswrapper[4753]: I1005 20:58:49.248555 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:58:49 crc kubenswrapper[4753]: I1005 20:58:49.505188 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q"] Oct 05 20:58:49 crc kubenswrapper[4753]: I1005 20:58:49.690237 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fzth"] Oct 05 20:58:49 crc kubenswrapper[4753]: W1005 20:58:49.692293 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5fc34cb_33b1_4b3d_ba2a_e2e9f04745dd.slice/crio-bf7d5caf7980afc1a48979482a2c83f9449b7a84f35dd8fde3332d6c418c0402 WatchSource:0}: Error finding container bf7d5caf7980afc1a48979482a2c83f9449b7a84f35dd8fde3332d6c418c0402: Status 404 returned error can't find the container with id bf7d5caf7980afc1a48979482a2c83f9449b7a84f35dd8fde3332d6c418c0402 Oct 05 20:58:50 crc kubenswrapper[4753]: I1005 20:58:50.461081 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" event={"ID":"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85","Type":"ContainerStarted","Data":"e0156374cee2513045031443eceb6c3a2bc19d20de4a7d33cfef7b8b25344c92"} Oct 05 20:58:50 crc kubenswrapper[4753]: I1005 20:58:50.461462 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" event={"ID":"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85","Type":"ContainerStarted","Data":"a6e287eaf02cd8e3ab68419c27e66d7357c913b0c5239d8da11a48ee3b029e76"} Oct 05 20:58:50 crc kubenswrapper[4753]: I1005 20:58:50.464249 4753 generic.go:334] "Generic (PLEG): container finished" podID="b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd" containerID="b5c1db3b65bf20283b64d739554c6c88461765ae828b1906fa3fdaa70b9b8a5f" exitCode=0 Oct 05 20:58:50 crc kubenswrapper[4753]: I1005 20:58:50.464290 4753 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fzth" event={"ID":"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd","Type":"ContainerDied","Data":"b5c1db3b65bf20283b64d739554c6c88461765ae828b1906fa3fdaa70b9b8a5f"} Oct 05 20:58:50 crc kubenswrapper[4753]: I1005 20:58:50.464316 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fzth" event={"ID":"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd","Type":"ContainerStarted","Data":"bf7d5caf7980afc1a48979482a2c83f9449b7a84f35dd8fde3332d6c418c0402"} Oct 05 20:58:50 crc kubenswrapper[4753]: I1005 20:58:50.490035 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" podStartSLOduration=1.99988611 podStartE2EDuration="2.490012469s" podCreationTimestamp="2025-10-05 20:58:48 +0000 UTC" firstStartedPulling="2025-10-05 20:58:49.516459398 +0000 UTC m=+2638.364787630" lastFinishedPulling="2025-10-05 20:58:50.006585757 +0000 UTC m=+2638.854913989" observedRunningTime="2025-10-05 20:58:50.482019654 +0000 UTC m=+2639.330347886" watchObservedRunningTime="2025-10-05 20:58:50.490012469 +0000 UTC m=+2639.338340691" Oct 05 20:58:51 crc kubenswrapper[4753]: I1005 20:58:51.475121 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fzth" event={"ID":"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd","Type":"ContainerStarted","Data":"b85a8a9d77272bcac73dbd3d0d57b480d214d688cda353dbaecc980c62f3d6ef"} Oct 05 20:58:52 crc kubenswrapper[4753]: I1005 20:58:52.485015 4753 generic.go:334] "Generic (PLEG): container finished" podID="b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd" containerID="b85a8a9d77272bcac73dbd3d0d57b480d214d688cda353dbaecc980c62f3d6ef" exitCode=0 Oct 05 20:58:52 crc kubenswrapper[4753]: I1005 20:58:52.485396 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fzth" 
event={"ID":"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd","Type":"ContainerDied","Data":"b85a8a9d77272bcac73dbd3d0d57b480d214d688cda353dbaecc980c62f3d6ef"} Oct 05 20:58:53 crc kubenswrapper[4753]: I1005 20:58:53.498975 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fzth" event={"ID":"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd","Type":"ContainerStarted","Data":"58d387d4ab713b533c6582cef10aebd7aa783107a3e9ecd3cd020487b31b8f39"} Oct 05 20:58:53 crc kubenswrapper[4753]: I1005 20:58:53.521487 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6fzth" podStartSLOduration=3.127480409 podStartE2EDuration="5.521472944s" podCreationTimestamp="2025-10-05 20:58:48 +0000 UTC" firstStartedPulling="2025-10-05 20:58:50.465939122 +0000 UTC m=+2639.314267354" lastFinishedPulling="2025-10-05 20:58:52.859931657 +0000 UTC m=+2641.708259889" observedRunningTime="2025-10-05 20:58:53.514969715 +0000 UTC m=+2642.363297947" watchObservedRunningTime="2025-10-05 20:58:53.521472944 +0000 UTC m=+2642.369801176" Oct 05 20:58:59 crc kubenswrapper[4753]: I1005 20:58:59.249155 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:58:59 crc kubenswrapper[4753]: I1005 20:58:59.249813 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:58:59 crc kubenswrapper[4753]: I1005 20:58:59.307995 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:58:59 crc kubenswrapper[4753]: I1005 20:58:59.647370 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:58:59 crc kubenswrapper[4753]: I1005 20:58:59.699467 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-6fzth"] Oct 05 20:59:01 crc kubenswrapper[4753]: I1005 20:59:01.605652 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6fzth" podUID="b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd" containerName="registry-server" containerID="cri-o://58d387d4ab713b533c6582cef10aebd7aa783107a3e9ecd3cd020487b31b8f39" gracePeriod=2 Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.139447 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.194892 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qw2j\" (UniqueName: \"kubernetes.io/projected/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-kube-api-access-5qw2j\") pod \"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd\" (UID: \"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd\") " Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.194930 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-utilities\") pod \"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd\" (UID: \"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd\") " Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.195082 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-catalog-content\") pod \"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd\" (UID: \"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd\") " Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.198207 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-utilities" (OuterVolumeSpecName: "utilities") pod "b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd" (UID: 
"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.205546 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-kube-api-access-5qw2j" (OuterVolumeSpecName: "kube-api-access-5qw2j") pod "b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd" (UID: "b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd"). InnerVolumeSpecName "kube-api-access-5qw2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.216700 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd" (UID: "b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.296932 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qw2j\" (UniqueName: \"kubernetes.io/projected/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-kube-api-access-5qw2j\") on node \"crc\" DevicePath \"\"" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.296969 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.296979 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.616772 4753 generic.go:334] "Generic (PLEG): container finished" 
podID="b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd" containerID="58d387d4ab713b533c6582cef10aebd7aa783107a3e9ecd3cd020487b31b8f39" exitCode=0 Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.616813 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fzth" event={"ID":"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd","Type":"ContainerDied","Data":"58d387d4ab713b533c6582cef10aebd7aa783107a3e9ecd3cd020487b31b8f39"} Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.616837 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fzth" event={"ID":"b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd","Type":"ContainerDied","Data":"bf7d5caf7980afc1a48979482a2c83f9449b7a84f35dd8fde3332d6c418c0402"} Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.616852 4753 scope.go:117] "RemoveContainer" containerID="58d387d4ab713b533c6582cef10aebd7aa783107a3e9ecd3cd020487b31b8f39" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.616961 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fzth" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.656236 4753 scope.go:117] "RemoveContainer" containerID="b85a8a9d77272bcac73dbd3d0d57b480d214d688cda353dbaecc980c62f3d6ef" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.664745 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fzth"] Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.673912 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fzth"] Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.698617 4753 scope.go:117] "RemoveContainer" containerID="b5c1db3b65bf20283b64d739554c6c88461765ae828b1906fa3fdaa70b9b8a5f" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.728033 4753 scope.go:117] "RemoveContainer" containerID="58d387d4ab713b533c6582cef10aebd7aa783107a3e9ecd3cd020487b31b8f39" Oct 05 20:59:02 crc kubenswrapper[4753]: E1005 20:59:02.728629 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d387d4ab713b533c6582cef10aebd7aa783107a3e9ecd3cd020487b31b8f39\": container with ID starting with 58d387d4ab713b533c6582cef10aebd7aa783107a3e9ecd3cd020487b31b8f39 not found: ID does not exist" containerID="58d387d4ab713b533c6582cef10aebd7aa783107a3e9ecd3cd020487b31b8f39" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.728680 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d387d4ab713b533c6582cef10aebd7aa783107a3e9ecd3cd020487b31b8f39"} err="failed to get container status \"58d387d4ab713b533c6582cef10aebd7aa783107a3e9ecd3cd020487b31b8f39\": rpc error: code = NotFound desc = could not find container \"58d387d4ab713b533c6582cef10aebd7aa783107a3e9ecd3cd020487b31b8f39\": container with ID starting with 58d387d4ab713b533c6582cef10aebd7aa783107a3e9ecd3cd020487b31b8f39 not found: 
ID does not exist" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.728714 4753 scope.go:117] "RemoveContainer" containerID="b85a8a9d77272bcac73dbd3d0d57b480d214d688cda353dbaecc980c62f3d6ef" Oct 05 20:59:02 crc kubenswrapper[4753]: E1005 20:59:02.728977 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b85a8a9d77272bcac73dbd3d0d57b480d214d688cda353dbaecc980c62f3d6ef\": container with ID starting with b85a8a9d77272bcac73dbd3d0d57b480d214d688cda353dbaecc980c62f3d6ef not found: ID does not exist" containerID="b85a8a9d77272bcac73dbd3d0d57b480d214d688cda353dbaecc980c62f3d6ef" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.729011 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b85a8a9d77272bcac73dbd3d0d57b480d214d688cda353dbaecc980c62f3d6ef"} err="failed to get container status \"b85a8a9d77272bcac73dbd3d0d57b480d214d688cda353dbaecc980c62f3d6ef\": rpc error: code = NotFound desc = could not find container \"b85a8a9d77272bcac73dbd3d0d57b480d214d688cda353dbaecc980c62f3d6ef\": container with ID starting with b85a8a9d77272bcac73dbd3d0d57b480d214d688cda353dbaecc980c62f3d6ef not found: ID does not exist" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.729036 4753 scope.go:117] "RemoveContainer" containerID="b5c1db3b65bf20283b64d739554c6c88461765ae828b1906fa3fdaa70b9b8a5f" Oct 05 20:59:02 crc kubenswrapper[4753]: E1005 20:59:02.729602 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5c1db3b65bf20283b64d739554c6c88461765ae828b1906fa3fdaa70b9b8a5f\": container with ID starting with b5c1db3b65bf20283b64d739554c6c88461765ae828b1906fa3fdaa70b9b8a5f not found: ID does not exist" containerID="b5c1db3b65bf20283b64d739554c6c88461765ae828b1906fa3fdaa70b9b8a5f" Oct 05 20:59:02 crc kubenswrapper[4753]: I1005 20:59:02.729643 4753 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c1db3b65bf20283b64d739554c6c88461765ae828b1906fa3fdaa70b9b8a5f"} err="failed to get container status \"b5c1db3b65bf20283b64d739554c6c88461765ae828b1906fa3fdaa70b9b8a5f\": rpc error: code = NotFound desc = could not find container \"b5c1db3b65bf20283b64d739554c6c88461765ae828b1906fa3fdaa70b9b8a5f\": container with ID starting with b5c1db3b65bf20283b64d739554c6c88461765ae828b1906fa3fdaa70b9b8a5f not found: ID does not exist" Oct 05 20:59:03 crc kubenswrapper[4753]: I1005 20:59:03.861666 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd" path="/var/lib/kubelet/pods/b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd/volumes" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.088280 4753 generic.go:334] "Generic (PLEG): container finished" podID="e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85" containerID="e0156374cee2513045031443eceb6c3a2bc19d20de4a7d33cfef7b8b25344c92" exitCode=0 Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.088500 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" event={"ID":"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85","Type":"ContainerDied","Data":"e0156374cee2513045031443eceb6c3a2bc19d20de4a7d33cfef7b8b25344c92"} Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.156584 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz"] Oct 05 21:00:00 crc kubenswrapper[4753]: E1005 21:00:00.157193 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd" containerName="registry-server" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.157225 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd" containerName="registry-server" Oct 05 21:00:00 crc kubenswrapper[4753]: E1005 21:00:00.157252 4753 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd" containerName="extract-utilities" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.157264 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd" containerName="extract-utilities" Oct 05 21:00:00 crc kubenswrapper[4753]: E1005 21:00:00.157288 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd" containerName="extract-content" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.157300 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd" containerName="extract-content" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.157591 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5fc34cb-33b1-4b3d-ba2a-e2e9f04745dd" containerName="registry-server" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.158490 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.161113 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.161306 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.169448 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz"] Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.296952 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9gnl\" (UniqueName: \"kubernetes.io/projected/28fa4c11-464a-42da-869b-a8d9739448a0-kube-api-access-x9gnl\") pod \"collect-profiles-29328300-9m8pz\" (UID: \"28fa4c11-464a-42da-869b-a8d9739448a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.297033 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28fa4c11-464a-42da-869b-a8d9739448a0-config-volume\") pod \"collect-profiles-29328300-9m8pz\" (UID: \"28fa4c11-464a-42da-869b-a8d9739448a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.297269 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28fa4c11-464a-42da-869b-a8d9739448a0-secret-volume\") pod \"collect-profiles-29328300-9m8pz\" (UID: \"28fa4c11-464a-42da-869b-a8d9739448a0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.398555 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28fa4c11-464a-42da-869b-a8d9739448a0-secret-volume\") pod \"collect-profiles-29328300-9m8pz\" (UID: \"28fa4c11-464a-42da-869b-a8d9739448a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.398671 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9gnl\" (UniqueName: \"kubernetes.io/projected/28fa4c11-464a-42da-869b-a8d9739448a0-kube-api-access-x9gnl\") pod \"collect-profiles-29328300-9m8pz\" (UID: \"28fa4c11-464a-42da-869b-a8d9739448a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.398768 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28fa4c11-464a-42da-869b-a8d9739448a0-config-volume\") pod \"collect-profiles-29328300-9m8pz\" (UID: \"28fa4c11-464a-42da-869b-a8d9739448a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.399583 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28fa4c11-464a-42da-869b-a8d9739448a0-config-volume\") pod \"collect-profiles-29328300-9m8pz\" (UID: \"28fa4c11-464a-42da-869b-a8d9739448a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.404830 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/28fa4c11-464a-42da-869b-a8d9739448a0-secret-volume\") pod \"collect-profiles-29328300-9m8pz\" (UID: \"28fa4c11-464a-42da-869b-a8d9739448a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.416802 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9gnl\" (UniqueName: \"kubernetes.io/projected/28fa4c11-464a-42da-869b-a8d9739448a0-kube-api-access-x9gnl\") pod \"collect-profiles-29328300-9m8pz\" (UID: \"28fa4c11-464a-42da-869b-a8d9739448a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.477847 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" Oct 05 21:00:00 crc kubenswrapper[4753]: I1005 21:00:00.954846 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz"] Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.101770 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" event={"ID":"28fa4c11-464a-42da-869b-a8d9739448a0","Type":"ContainerStarted","Data":"2c62ac10856430624e3328ea5b8048467ed79a6aac332ca5b3b6a5cf2209bcd3"} Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.519424 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.617735 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-neutron-ovn-metadata-agent-neutron-config-0\") pod \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.617811 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9lbn\" (UniqueName: \"kubernetes.io/projected/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-kube-api-access-c9lbn\") pod \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.617858 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-nova-metadata-neutron-config-0\") pod \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.617908 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-ceph\") pod \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.617934 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-neutron-metadata-combined-ca-bundle\") pod \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " Oct 05 21:00:01 
crc kubenswrapper[4753]: I1005 21:00:01.617967 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-ssh-key\") pod \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.617982 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-inventory\") pod \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\" (UID: \"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85\") " Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.624398 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-ceph" (OuterVolumeSpecName: "ceph") pod "e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85" (UID: "e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.624758 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-kube-api-access-c9lbn" (OuterVolumeSpecName: "kube-api-access-c9lbn") pod "e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85" (UID: "e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85"). InnerVolumeSpecName "kube-api-access-c9lbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.624770 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85" (UID: "e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.643548 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85" (UID: "e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.646675 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-inventory" (OuterVolumeSpecName: "inventory") pod "e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85" (UID: "e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.651021 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85" (UID: "e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.651220 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85" (UID: "e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.720552 4753 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.720584 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9lbn\" (UniqueName: \"kubernetes.io/projected/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-kube-api-access-c9lbn\") on node \"crc\" DevicePath \"\"" Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.720596 4753 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.720608 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.720620 4753 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.720630 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 21:00:01 crc kubenswrapper[4753]: I1005 21:00:01.720722 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85-inventory\") on node \"crc\" 
DevicePath \"\"" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.113322 4753 generic.go:334] "Generic (PLEG): container finished" podID="28fa4c11-464a-42da-869b-a8d9739448a0" containerID="2c402487be5937385b945122e82dea7edda835d83e17e15b349c0a1a5aa9066f" exitCode=0 Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.113417 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" event={"ID":"28fa4c11-464a-42da-869b-a8d9739448a0","Type":"ContainerDied","Data":"2c402487be5937385b945122e82dea7edda835d83e17e15b349c0a1a5aa9066f"} Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.120237 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" event={"ID":"e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85","Type":"ContainerDied","Data":"a6e287eaf02cd8e3ab68419c27e66d7357c913b0c5239d8da11a48ee3b029e76"} Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.120268 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6e287eaf02cd8e3ab68419c27e66d7357c913b0c5239d8da11a48ee3b029e76" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.120431 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.211044 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62"] Oct 05 21:00:02 crc kubenswrapper[4753]: E1005 21:00:02.211508 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.211534 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.211762 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.212481 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.214405 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.215607 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.215740 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.215914 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.216028 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.216130 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.220556 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62"] Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.331178 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.331235 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knqv5\" (UniqueName: 
\"kubernetes.io/projected/9f393cda-bc70-44d4-a534-a72b71dcf0b7-kube-api-access-knqv5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.331269 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.331296 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.331456 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.331575 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.433577 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.433645 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knqv5\" (UniqueName: \"kubernetes.io/projected/9f393cda-bc70-44d4-a534-a72b71dcf0b7-kube-api-access-knqv5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.433695 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.433731 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.433773 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.433810 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.438231 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.439302 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.439928 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 
21:00:02.441166 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.445857 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.457877 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knqv5\" (UniqueName: \"kubernetes.io/projected/9f393cda-bc70-44d4-a534-a72b71dcf0b7-kube-api-access-knqv5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrg62\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:02 crc kubenswrapper[4753]: I1005 21:00:02.551114 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:00:03 crc kubenswrapper[4753]: I1005 21:00:03.101934 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62"] Oct 05 21:00:03 crc kubenswrapper[4753]: W1005 21:00:03.105224 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f393cda_bc70_44d4_a534_a72b71dcf0b7.slice/crio-880ba9eaf6af9e5780df90ed80d6c53430b03efb3868cdeeb2a443e0f3ee7445 WatchSource:0}: Error finding container 880ba9eaf6af9e5780df90ed80d6c53430b03efb3868cdeeb2a443e0f3ee7445: Status 404 returned error can't find the container with id 880ba9eaf6af9e5780df90ed80d6c53430b03efb3868cdeeb2a443e0f3ee7445 Oct 05 21:00:03 crc kubenswrapper[4753]: I1005 21:00:03.113735 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 05 21:00:03 crc kubenswrapper[4753]: I1005 21:00:03.131066 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" event={"ID":"9f393cda-bc70-44d4-a534-a72b71dcf0b7","Type":"ContainerStarted","Data":"880ba9eaf6af9e5780df90ed80d6c53430b03efb3868cdeeb2a443e0f3ee7445"} Oct 05 21:00:03 crc kubenswrapper[4753]: I1005 21:00:03.341749 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" Oct 05 21:00:03 crc kubenswrapper[4753]: I1005 21:00:03.451100 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28fa4c11-464a-42da-869b-a8d9739448a0-config-volume\") pod \"28fa4c11-464a-42da-869b-a8d9739448a0\" (UID: \"28fa4c11-464a-42da-869b-a8d9739448a0\") " Oct 05 21:00:03 crc kubenswrapper[4753]: I1005 21:00:03.451288 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28fa4c11-464a-42da-869b-a8d9739448a0-secret-volume\") pod \"28fa4c11-464a-42da-869b-a8d9739448a0\" (UID: \"28fa4c11-464a-42da-869b-a8d9739448a0\") " Oct 05 21:00:03 crc kubenswrapper[4753]: I1005 21:00:03.451332 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9gnl\" (UniqueName: \"kubernetes.io/projected/28fa4c11-464a-42da-869b-a8d9739448a0-kube-api-access-x9gnl\") pod \"28fa4c11-464a-42da-869b-a8d9739448a0\" (UID: \"28fa4c11-464a-42da-869b-a8d9739448a0\") " Oct 05 21:00:03 crc kubenswrapper[4753]: I1005 21:00:03.452858 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28fa4c11-464a-42da-869b-a8d9739448a0-config-volume" (OuterVolumeSpecName: "config-volume") pod "28fa4c11-464a-42da-869b-a8d9739448a0" (UID: "28fa4c11-464a-42da-869b-a8d9739448a0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:00:03 crc kubenswrapper[4753]: I1005 21:00:03.457527 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28fa4c11-464a-42da-869b-a8d9739448a0-kube-api-access-x9gnl" (OuterVolumeSpecName: "kube-api-access-x9gnl") pod "28fa4c11-464a-42da-869b-a8d9739448a0" (UID: "28fa4c11-464a-42da-869b-a8d9739448a0"). 
InnerVolumeSpecName "kube-api-access-x9gnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:00:03 crc kubenswrapper[4753]: I1005 21:00:03.459306 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28fa4c11-464a-42da-869b-a8d9739448a0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "28fa4c11-464a-42da-869b-a8d9739448a0" (UID: "28fa4c11-464a-42da-869b-a8d9739448a0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:00:03 crc kubenswrapper[4753]: I1005 21:00:03.553814 4753 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28fa4c11-464a-42da-869b-a8d9739448a0-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 05 21:00:03 crc kubenswrapper[4753]: I1005 21:00:03.553854 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9gnl\" (UniqueName: \"kubernetes.io/projected/28fa4c11-464a-42da-869b-a8d9739448a0-kube-api-access-x9gnl\") on node \"crc\" DevicePath \"\"" Oct 05 21:00:03 crc kubenswrapper[4753]: I1005 21:00:03.553865 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28fa4c11-464a-42da-869b-a8d9739448a0-config-volume\") on node \"crc\" DevicePath \"\"" Oct 05 21:00:04 crc kubenswrapper[4753]: I1005 21:00:04.141604 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" event={"ID":"28fa4c11-464a-42da-869b-a8d9739448a0","Type":"ContainerDied","Data":"2c62ac10856430624e3328ea5b8048467ed79a6aac332ca5b3b6a5cf2209bcd3"} Oct 05 21:00:04 crc kubenswrapper[4753]: I1005 21:00:04.141677 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328300-9m8pz" Oct 05 21:00:04 crc kubenswrapper[4753]: I1005 21:00:04.141683 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c62ac10856430624e3328ea5b8048467ed79a6aac332ca5b3b6a5cf2209bcd3" Oct 05 21:00:04 crc kubenswrapper[4753]: I1005 21:00:04.143850 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" event={"ID":"9f393cda-bc70-44d4-a534-a72b71dcf0b7","Type":"ContainerStarted","Data":"e0f58d99465c2d2291b066e8402a2af224474499d2dacec991eed7e7fe2b4f93"} Oct 05 21:00:04 crc kubenswrapper[4753]: I1005 21:00:04.372376 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" podStartSLOduration=1.7407951320000001 podStartE2EDuration="2.37235654s" podCreationTimestamp="2025-10-05 21:00:02 +0000 UTC" firstStartedPulling="2025-10-05 21:00:03.113550775 +0000 UTC m=+2711.961879007" lastFinishedPulling="2025-10-05 21:00:03.745112183 +0000 UTC m=+2712.593440415" observedRunningTime="2025-10-05 21:00:04.164479485 +0000 UTC m=+2713.012807727" watchObservedRunningTime="2025-10-05 21:00:04.37235654 +0000 UTC m=+2713.220684772" Oct 05 21:00:04 crc kubenswrapper[4753]: I1005 21:00:04.412807 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm"] Oct 05 21:00:04 crc kubenswrapper[4753]: I1005 21:00:04.418895 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328255-xfjfm"] Oct 05 21:00:04 crc kubenswrapper[4753]: I1005 21:00:04.489951 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:00:04 crc kubenswrapper[4753]: I1005 21:00:04.490007 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:00:05 crc kubenswrapper[4753]: I1005 21:00:05.871942 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c622e2-78c9-42bd-9031-776cded4435c" path="/var/lib/kubelet/pods/10c622e2-78c9-42bd-9031-776cded4435c/volumes" Oct 05 21:00:17 crc kubenswrapper[4753]: I1005 21:00:17.749341 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p457z"] Oct 05 21:00:17 crc kubenswrapper[4753]: E1005 21:00:17.750732 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28fa4c11-464a-42da-869b-a8d9739448a0" containerName="collect-profiles" Oct 05 21:00:17 crc kubenswrapper[4753]: I1005 21:00:17.750751 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fa4c11-464a-42da-869b-a8d9739448a0" containerName="collect-profiles" Oct 05 21:00:17 crc kubenswrapper[4753]: I1005 21:00:17.750985 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="28fa4c11-464a-42da-869b-a8d9739448a0" containerName="collect-profiles" Oct 05 21:00:17 crc kubenswrapper[4753]: I1005 21:00:17.754678 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:17 crc kubenswrapper[4753]: I1005 21:00:17.763468 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p457z"] Oct 05 21:00:17 crc kubenswrapper[4753]: I1005 21:00:17.821551 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63932ea7-25f8-487a-bfd1-43e1963f4a54-catalog-content\") pod \"redhat-operators-p457z\" (UID: \"63932ea7-25f8-487a-bfd1-43e1963f4a54\") " pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:17 crc kubenswrapper[4753]: I1005 21:00:17.821599 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tznqp\" (UniqueName: \"kubernetes.io/projected/63932ea7-25f8-487a-bfd1-43e1963f4a54-kube-api-access-tznqp\") pod \"redhat-operators-p457z\" (UID: \"63932ea7-25f8-487a-bfd1-43e1963f4a54\") " pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:17 crc kubenswrapper[4753]: I1005 21:00:17.821697 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63932ea7-25f8-487a-bfd1-43e1963f4a54-utilities\") pod \"redhat-operators-p457z\" (UID: \"63932ea7-25f8-487a-bfd1-43e1963f4a54\") " pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:17 crc kubenswrapper[4753]: I1005 21:00:17.923227 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63932ea7-25f8-487a-bfd1-43e1963f4a54-catalog-content\") pod \"redhat-operators-p457z\" (UID: \"63932ea7-25f8-487a-bfd1-43e1963f4a54\") " pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:17 crc kubenswrapper[4753]: I1005 21:00:17.923832 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63932ea7-25f8-487a-bfd1-43e1963f4a54-catalog-content\") pod \"redhat-operators-p457z\" (UID: \"63932ea7-25f8-487a-bfd1-43e1963f4a54\") " pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:17 crc kubenswrapper[4753]: I1005 21:00:17.923869 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tznqp\" (UniqueName: \"kubernetes.io/projected/63932ea7-25f8-487a-bfd1-43e1963f4a54-kube-api-access-tznqp\") pod \"redhat-operators-p457z\" (UID: \"63932ea7-25f8-487a-bfd1-43e1963f4a54\") " pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:17 crc kubenswrapper[4753]: I1005 21:00:17.924010 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63932ea7-25f8-487a-bfd1-43e1963f4a54-utilities\") pod \"redhat-operators-p457z\" (UID: \"63932ea7-25f8-487a-bfd1-43e1963f4a54\") " pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:17 crc kubenswrapper[4753]: I1005 21:00:17.924941 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63932ea7-25f8-487a-bfd1-43e1963f4a54-utilities\") pod \"redhat-operators-p457z\" (UID: \"63932ea7-25f8-487a-bfd1-43e1963f4a54\") " pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:17 crc kubenswrapper[4753]: I1005 21:00:17.947114 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tznqp\" (UniqueName: \"kubernetes.io/projected/63932ea7-25f8-487a-bfd1-43e1963f4a54-kube-api-access-tznqp\") pod \"redhat-operators-p457z\" (UID: \"63932ea7-25f8-487a-bfd1-43e1963f4a54\") " pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:18 crc kubenswrapper[4753]: I1005 21:00:18.070883 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:18 crc kubenswrapper[4753]: I1005 21:00:18.561129 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p457z"] Oct 05 21:00:19 crc kubenswrapper[4753]: I1005 21:00:19.273036 4753 generic.go:334] "Generic (PLEG): container finished" podID="63932ea7-25f8-487a-bfd1-43e1963f4a54" containerID="c51a153f4e3f68a46974cd0ed8d7004a2a929cff93dbae611aeffce438002785" exitCode=0 Oct 05 21:00:19 crc kubenswrapper[4753]: I1005 21:00:19.273096 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p457z" event={"ID":"63932ea7-25f8-487a-bfd1-43e1963f4a54","Type":"ContainerDied","Data":"c51a153f4e3f68a46974cd0ed8d7004a2a929cff93dbae611aeffce438002785"} Oct 05 21:00:19 crc kubenswrapper[4753]: I1005 21:00:19.273506 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p457z" event={"ID":"63932ea7-25f8-487a-bfd1-43e1963f4a54","Type":"ContainerStarted","Data":"0a4b61240df82f4637ba3130b1c881d9ff633fff9c334dccb04b14fb11076984"} Oct 05 21:00:21 crc kubenswrapper[4753]: I1005 21:00:21.302428 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p457z" event={"ID":"63932ea7-25f8-487a-bfd1-43e1963f4a54","Type":"ContainerStarted","Data":"28e153bda85c0c706bf94e09f6dd7a183c7ba5a629c04da2d48b3a854c408b99"} Oct 05 21:00:24 crc kubenswrapper[4753]: I1005 21:00:24.330946 4753 generic.go:334] "Generic (PLEG): container finished" podID="63932ea7-25f8-487a-bfd1-43e1963f4a54" containerID="28e153bda85c0c706bf94e09f6dd7a183c7ba5a629c04da2d48b3a854c408b99" exitCode=0 Oct 05 21:00:24 crc kubenswrapper[4753]: I1005 21:00:24.331031 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p457z" 
event={"ID":"63932ea7-25f8-487a-bfd1-43e1963f4a54","Type":"ContainerDied","Data":"28e153bda85c0c706bf94e09f6dd7a183c7ba5a629c04da2d48b3a854c408b99"} Oct 05 21:00:26 crc kubenswrapper[4753]: I1005 21:00:26.350871 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p457z" event={"ID":"63932ea7-25f8-487a-bfd1-43e1963f4a54","Type":"ContainerStarted","Data":"ad2197205269e56e00d6c69835243a8cd240bcaa32692afdd7e6e963d20ee8b2"} Oct 05 21:00:26 crc kubenswrapper[4753]: I1005 21:00:26.377271 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p457z" podStartSLOduration=3.285542869 podStartE2EDuration="9.377250281s" podCreationTimestamp="2025-10-05 21:00:17 +0000 UTC" firstStartedPulling="2025-10-05 21:00:19.27676715 +0000 UTC m=+2728.125095382" lastFinishedPulling="2025-10-05 21:00:25.368474532 +0000 UTC m=+2734.216802794" observedRunningTime="2025-10-05 21:00:26.369214815 +0000 UTC m=+2735.217543057" watchObservedRunningTime="2025-10-05 21:00:26.377250281 +0000 UTC m=+2735.225578513" Oct 05 21:00:28 crc kubenswrapper[4753]: I1005 21:00:28.071044 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:28 crc kubenswrapper[4753]: I1005 21:00:28.071419 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:29 crc kubenswrapper[4753]: I1005 21:00:29.130969 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p457z" podUID="63932ea7-25f8-487a-bfd1-43e1963f4a54" containerName="registry-server" probeResult="failure" output=< Oct 05 21:00:29 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 21:00:29 crc kubenswrapper[4753]: > Oct 05 21:00:34 crc kubenswrapper[4753]: I1005 21:00:34.489972 4753 patch_prober.go:28] interesting 
pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:00:34 crc kubenswrapper[4753]: I1005 21:00:34.490508 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:00:38 crc kubenswrapper[4753]: I1005 21:00:38.124815 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:38 crc kubenswrapper[4753]: I1005 21:00:38.185757 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:38 crc kubenswrapper[4753]: I1005 21:00:38.377759 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p457z"] Oct 05 21:00:39 crc kubenswrapper[4753]: I1005 21:00:39.465866 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p457z" podUID="63932ea7-25f8-487a-bfd1-43e1963f4a54" containerName="registry-server" containerID="cri-o://ad2197205269e56e00d6c69835243a8cd240bcaa32692afdd7e6e963d20ee8b2" gracePeriod=2 Oct 05 21:00:39 crc kubenswrapper[4753]: I1005 21:00:39.869136 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.028412 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tznqp\" (UniqueName: \"kubernetes.io/projected/63932ea7-25f8-487a-bfd1-43e1963f4a54-kube-api-access-tznqp\") pod \"63932ea7-25f8-487a-bfd1-43e1963f4a54\" (UID: \"63932ea7-25f8-487a-bfd1-43e1963f4a54\") " Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.028460 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63932ea7-25f8-487a-bfd1-43e1963f4a54-catalog-content\") pod \"63932ea7-25f8-487a-bfd1-43e1963f4a54\" (UID: \"63932ea7-25f8-487a-bfd1-43e1963f4a54\") " Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.028485 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63932ea7-25f8-487a-bfd1-43e1963f4a54-utilities\") pod \"63932ea7-25f8-487a-bfd1-43e1963f4a54\" (UID: \"63932ea7-25f8-487a-bfd1-43e1963f4a54\") " Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.029608 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63932ea7-25f8-487a-bfd1-43e1963f4a54-utilities" (OuterVolumeSpecName: "utilities") pod "63932ea7-25f8-487a-bfd1-43e1963f4a54" (UID: "63932ea7-25f8-487a-bfd1-43e1963f4a54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.037437 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63932ea7-25f8-487a-bfd1-43e1963f4a54-kube-api-access-tznqp" (OuterVolumeSpecName: "kube-api-access-tznqp") pod "63932ea7-25f8-487a-bfd1-43e1963f4a54" (UID: "63932ea7-25f8-487a-bfd1-43e1963f4a54"). InnerVolumeSpecName "kube-api-access-tznqp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.126505 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63932ea7-25f8-487a-bfd1-43e1963f4a54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63932ea7-25f8-487a-bfd1-43e1963f4a54" (UID: "63932ea7-25f8-487a-bfd1-43e1963f4a54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.130851 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tznqp\" (UniqueName: \"kubernetes.io/projected/63932ea7-25f8-487a-bfd1-43e1963f4a54-kube-api-access-tznqp\") on node \"crc\" DevicePath \"\"" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.130887 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63932ea7-25f8-487a-bfd1-43e1963f4a54-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.130897 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63932ea7-25f8-487a-bfd1-43e1963f4a54-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.475380 4753 generic.go:334] "Generic (PLEG): container finished" podID="63932ea7-25f8-487a-bfd1-43e1963f4a54" containerID="ad2197205269e56e00d6c69835243a8cd240bcaa32692afdd7e6e963d20ee8b2" exitCode=0 Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.475449 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p457z" event={"ID":"63932ea7-25f8-487a-bfd1-43e1963f4a54","Type":"ContainerDied","Data":"ad2197205269e56e00d6c69835243a8cd240bcaa32692afdd7e6e963d20ee8b2"} Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.475489 4753 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-p457z" event={"ID":"63932ea7-25f8-487a-bfd1-43e1963f4a54","Type":"ContainerDied","Data":"0a4b61240df82f4637ba3130b1c881d9ff633fff9c334dccb04b14fb11076984"} Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.475505 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p457z" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.475510 4753 scope.go:117] "RemoveContainer" containerID="ad2197205269e56e00d6c69835243a8cd240bcaa32692afdd7e6e963d20ee8b2" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.506515 4753 scope.go:117] "RemoveContainer" containerID="28e153bda85c0c706bf94e09f6dd7a183c7ba5a629c04da2d48b3a854c408b99" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.515263 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p457z"] Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.522912 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p457z"] Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.539428 4753 scope.go:117] "RemoveContainer" containerID="c51a153f4e3f68a46974cd0ed8d7004a2a929cff93dbae611aeffce438002785" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.588697 4753 scope.go:117] "RemoveContainer" containerID="ad2197205269e56e00d6c69835243a8cd240bcaa32692afdd7e6e963d20ee8b2" Oct 05 21:00:40 crc kubenswrapper[4753]: E1005 21:00:40.589381 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad2197205269e56e00d6c69835243a8cd240bcaa32692afdd7e6e963d20ee8b2\": container with ID starting with ad2197205269e56e00d6c69835243a8cd240bcaa32692afdd7e6e963d20ee8b2 not found: ID does not exist" containerID="ad2197205269e56e00d6c69835243a8cd240bcaa32692afdd7e6e963d20ee8b2" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.589414 4753 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad2197205269e56e00d6c69835243a8cd240bcaa32692afdd7e6e963d20ee8b2"} err="failed to get container status \"ad2197205269e56e00d6c69835243a8cd240bcaa32692afdd7e6e963d20ee8b2\": rpc error: code = NotFound desc = could not find container \"ad2197205269e56e00d6c69835243a8cd240bcaa32692afdd7e6e963d20ee8b2\": container with ID starting with ad2197205269e56e00d6c69835243a8cd240bcaa32692afdd7e6e963d20ee8b2 not found: ID does not exist" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.589436 4753 scope.go:117] "RemoveContainer" containerID="28e153bda85c0c706bf94e09f6dd7a183c7ba5a629c04da2d48b3a854c408b99" Oct 05 21:00:40 crc kubenswrapper[4753]: E1005 21:00:40.589731 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28e153bda85c0c706bf94e09f6dd7a183c7ba5a629c04da2d48b3a854c408b99\": container with ID starting with 28e153bda85c0c706bf94e09f6dd7a183c7ba5a629c04da2d48b3a854c408b99 not found: ID does not exist" containerID="28e153bda85c0c706bf94e09f6dd7a183c7ba5a629c04da2d48b3a854c408b99" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.589780 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28e153bda85c0c706bf94e09f6dd7a183c7ba5a629c04da2d48b3a854c408b99"} err="failed to get container status \"28e153bda85c0c706bf94e09f6dd7a183c7ba5a629c04da2d48b3a854c408b99\": rpc error: code = NotFound desc = could not find container \"28e153bda85c0c706bf94e09f6dd7a183c7ba5a629c04da2d48b3a854c408b99\": container with ID starting with 28e153bda85c0c706bf94e09f6dd7a183c7ba5a629c04da2d48b3a854c408b99 not found: ID does not exist" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.589812 4753 scope.go:117] "RemoveContainer" containerID="c51a153f4e3f68a46974cd0ed8d7004a2a929cff93dbae611aeffce438002785" Oct 05 21:00:40 crc kubenswrapper[4753]: E1005 
21:00:40.590077 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c51a153f4e3f68a46974cd0ed8d7004a2a929cff93dbae611aeffce438002785\": container with ID starting with c51a153f4e3f68a46974cd0ed8d7004a2a929cff93dbae611aeffce438002785 not found: ID does not exist" containerID="c51a153f4e3f68a46974cd0ed8d7004a2a929cff93dbae611aeffce438002785" Oct 05 21:00:40 crc kubenswrapper[4753]: I1005 21:00:40.590098 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c51a153f4e3f68a46974cd0ed8d7004a2a929cff93dbae611aeffce438002785"} err="failed to get container status \"c51a153f4e3f68a46974cd0ed8d7004a2a929cff93dbae611aeffce438002785\": rpc error: code = NotFound desc = could not find container \"c51a153f4e3f68a46974cd0ed8d7004a2a929cff93dbae611aeffce438002785\": container with ID starting with c51a153f4e3f68a46974cd0ed8d7004a2a929cff93dbae611aeffce438002785 not found: ID does not exist" Oct 05 21:00:41 crc kubenswrapper[4753]: I1005 21:00:41.916898 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63932ea7-25f8-487a-bfd1-43e1963f4a54" path="/var/lib/kubelet/pods/63932ea7-25f8-487a-bfd1-43e1963f4a54/volumes" Oct 05 21:00:58 crc kubenswrapper[4753]: I1005 21:00:58.971764 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-58jh5"] Oct 05 21:00:58 crc kubenswrapper[4753]: E1005 21:00:58.972968 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63932ea7-25f8-487a-bfd1-43e1963f4a54" containerName="registry-server" Oct 05 21:00:58 crc kubenswrapper[4753]: I1005 21:00:58.972986 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="63932ea7-25f8-487a-bfd1-43e1963f4a54" containerName="registry-server" Oct 05 21:00:58 crc kubenswrapper[4753]: E1005 21:00:58.973008 4753 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="63932ea7-25f8-487a-bfd1-43e1963f4a54" containerName="extract-utilities" Oct 05 21:00:58 crc kubenswrapper[4753]: I1005 21:00:58.973019 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="63932ea7-25f8-487a-bfd1-43e1963f4a54" containerName="extract-utilities" Oct 05 21:00:58 crc kubenswrapper[4753]: E1005 21:00:58.973066 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63932ea7-25f8-487a-bfd1-43e1963f4a54" containerName="extract-content" Oct 05 21:00:58 crc kubenswrapper[4753]: I1005 21:00:58.973075 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="63932ea7-25f8-487a-bfd1-43e1963f4a54" containerName="extract-content" Oct 05 21:00:58 crc kubenswrapper[4753]: I1005 21:00:58.973333 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="63932ea7-25f8-487a-bfd1-43e1963f4a54" containerName="registry-server" Oct 05 21:00:58 crc kubenswrapper[4753]: I1005 21:00:58.975473 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:00:59 crc kubenswrapper[4753]: I1005 21:00:59.000031 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-58jh5"] Oct 05 21:00:59 crc kubenswrapper[4753]: I1005 21:00:59.116308 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/455ac15e-ed1d-4399-85fa-9980d811f818-catalog-content\") pod \"community-operators-58jh5\" (UID: \"455ac15e-ed1d-4399-85fa-9980d811f818\") " pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:00:59 crc kubenswrapper[4753]: I1005 21:00:59.116446 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/455ac15e-ed1d-4399-85fa-9980d811f818-utilities\") pod \"community-operators-58jh5\" (UID: 
\"455ac15e-ed1d-4399-85fa-9980d811f818\") " pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:00:59 crc kubenswrapper[4753]: I1005 21:00:59.116530 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh6ph\" (UniqueName: \"kubernetes.io/projected/455ac15e-ed1d-4399-85fa-9980d811f818-kube-api-access-nh6ph\") pod \"community-operators-58jh5\" (UID: \"455ac15e-ed1d-4399-85fa-9980d811f818\") " pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:00:59 crc kubenswrapper[4753]: I1005 21:00:59.218744 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/455ac15e-ed1d-4399-85fa-9980d811f818-catalog-content\") pod \"community-operators-58jh5\" (UID: \"455ac15e-ed1d-4399-85fa-9980d811f818\") " pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:00:59 crc kubenswrapper[4753]: I1005 21:00:59.218831 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/455ac15e-ed1d-4399-85fa-9980d811f818-utilities\") pod \"community-operators-58jh5\" (UID: \"455ac15e-ed1d-4399-85fa-9980d811f818\") " pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:00:59 crc kubenswrapper[4753]: I1005 21:00:59.218904 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh6ph\" (UniqueName: \"kubernetes.io/projected/455ac15e-ed1d-4399-85fa-9980d811f818-kube-api-access-nh6ph\") pod \"community-operators-58jh5\" (UID: \"455ac15e-ed1d-4399-85fa-9980d811f818\") " pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:00:59 crc kubenswrapper[4753]: I1005 21:00:59.219631 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/455ac15e-ed1d-4399-85fa-9980d811f818-utilities\") pod \"community-operators-58jh5\" (UID: 
\"455ac15e-ed1d-4399-85fa-9980d811f818\") " pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:00:59 crc kubenswrapper[4753]: I1005 21:00:59.219635 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/455ac15e-ed1d-4399-85fa-9980d811f818-catalog-content\") pod \"community-operators-58jh5\" (UID: \"455ac15e-ed1d-4399-85fa-9980d811f818\") " pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:00:59 crc kubenswrapper[4753]: I1005 21:00:59.251985 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh6ph\" (UniqueName: \"kubernetes.io/projected/455ac15e-ed1d-4399-85fa-9980d811f818-kube-api-access-nh6ph\") pod \"community-operators-58jh5\" (UID: \"455ac15e-ed1d-4399-85fa-9980d811f818\") " pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:00:59 crc kubenswrapper[4753]: I1005 21:00:59.304997 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:00:59 crc kubenswrapper[4753]: I1005 21:00:59.824167 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-58jh5"] Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.149158 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29328301-rlwxs"] Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.150470 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.161885 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29328301-rlwxs"] Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.337305 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bvcg\" (UniqueName: \"kubernetes.io/projected/383607d3-fca4-477a-a189-c6aab8192496-kube-api-access-2bvcg\") pod \"keystone-cron-29328301-rlwxs\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.337392 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-fernet-keys\") pod \"keystone-cron-29328301-rlwxs\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.337492 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-config-data\") pod \"keystone-cron-29328301-rlwxs\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.337509 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-combined-ca-bundle\") pod \"keystone-cron-29328301-rlwxs\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.439189 4753 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-config-data\") pod \"keystone-cron-29328301-rlwxs\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.439223 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-combined-ca-bundle\") pod \"keystone-cron-29328301-rlwxs\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.439255 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bvcg\" (UniqueName: \"kubernetes.io/projected/383607d3-fca4-477a-a189-c6aab8192496-kube-api-access-2bvcg\") pod \"keystone-cron-29328301-rlwxs\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.439321 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-fernet-keys\") pod \"keystone-cron-29328301-rlwxs\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.446105 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-config-data\") pod \"keystone-cron-29328301-rlwxs\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.446227 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-combined-ca-bundle\") pod \"keystone-cron-29328301-rlwxs\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.448674 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-fernet-keys\") pod \"keystone-cron-29328301-rlwxs\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.455730 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bvcg\" (UniqueName: \"kubernetes.io/projected/383607d3-fca4-477a-a189-c6aab8192496-kube-api-access-2bvcg\") pod \"keystone-cron-29328301-rlwxs\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.464484 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.671668 4753 generic.go:334] "Generic (PLEG): container finished" podID="455ac15e-ed1d-4399-85fa-9980d811f818" containerID="77a041286002bc9b22617f67a704b49ad5ec1baa6587372eafbe1c898d498957" exitCode=0 Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.671734 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58jh5" event={"ID":"455ac15e-ed1d-4399-85fa-9980d811f818","Type":"ContainerDied","Data":"77a041286002bc9b22617f67a704b49ad5ec1baa6587372eafbe1c898d498957"} Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.671764 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58jh5" event={"ID":"455ac15e-ed1d-4399-85fa-9980d811f818","Type":"ContainerStarted","Data":"0ec813d2e87a87c506778101f05560384e6f477a5161ef7890134d96e39332f2"} Oct 05 21:01:00 crc kubenswrapper[4753]: I1005 21:01:00.914411 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29328301-rlwxs"] Oct 05 21:01:01 crc kubenswrapper[4753]: I1005 21:01:01.680994 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58jh5" event={"ID":"455ac15e-ed1d-4399-85fa-9980d811f818","Type":"ContainerStarted","Data":"f8b55ec01735e5965fdc8d5f9d1dcb53172a7a61d8e88d97e667c2591b29fbe8"} Oct 05 21:01:01 crc kubenswrapper[4753]: I1005 21:01:01.682614 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29328301-rlwxs" event={"ID":"383607d3-fca4-477a-a189-c6aab8192496","Type":"ContainerStarted","Data":"7af12edf74b3677fbdbd56f408b57db9b0ecb342d928911d1b42ae2e202a8028"} Oct 05 21:01:01 crc kubenswrapper[4753]: I1005 21:01:01.682642 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29328301-rlwxs" 
event={"ID":"383607d3-fca4-477a-a189-c6aab8192496","Type":"ContainerStarted","Data":"ec41d41d5e593bbee23bae88247a3d417e2b9902c7d11915a04e1fe451577f23"} Oct 05 21:01:01 crc kubenswrapper[4753]: I1005 21:01:01.718690 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29328301-rlwxs" podStartSLOduration=1.71867504 podStartE2EDuration="1.71867504s" podCreationTimestamp="2025-10-05 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 21:01:01.717328169 +0000 UTC m=+2770.565656411" watchObservedRunningTime="2025-10-05 21:01:01.71867504 +0000 UTC m=+2770.567003272" Oct 05 21:01:02 crc kubenswrapper[4753]: I1005 21:01:02.692115 4753 generic.go:334] "Generic (PLEG): container finished" podID="455ac15e-ed1d-4399-85fa-9980d811f818" containerID="f8b55ec01735e5965fdc8d5f9d1dcb53172a7a61d8e88d97e667c2591b29fbe8" exitCode=0 Oct 05 21:01:02 crc kubenswrapper[4753]: I1005 21:01:02.692170 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58jh5" event={"ID":"455ac15e-ed1d-4399-85fa-9980d811f818","Type":"ContainerDied","Data":"f8b55ec01735e5965fdc8d5f9d1dcb53172a7a61d8e88d97e667c2591b29fbe8"} Oct 05 21:01:03 crc kubenswrapper[4753]: I1005 21:01:03.704323 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58jh5" event={"ID":"455ac15e-ed1d-4399-85fa-9980d811f818","Type":"ContainerStarted","Data":"408420c3cac714c87fcb978f5c354786d5a47bb81e9995e9400b90e15c67f66f"} Oct 05 21:01:04 crc kubenswrapper[4753]: I1005 21:01:04.495732 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:01:04 crc 
kubenswrapper[4753]: I1005 21:01:04.496813 4753 scope.go:117] "RemoveContainer" containerID="a12c5e14e454f061e96cbde4ee73a98879889ea378e7bd61c2821f3a9c3f6b8f" Oct 05 21:01:04 crc kubenswrapper[4753]: I1005 21:01:04.499279 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:01:04 crc kubenswrapper[4753]: I1005 21:01:04.499344 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 21:01:04 crc kubenswrapper[4753]: I1005 21:01:04.499998 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f370bebddf0b48b3eae16a7acc2526579ae4e4e93d5827de3123477139f20060"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 21:01:04 crc kubenswrapper[4753]: I1005 21:01:04.500055 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" containerID="cri-o://f370bebddf0b48b3eae16a7acc2526579ae4e4e93d5827de3123477139f20060" gracePeriod=600 Oct 05 21:01:04 crc kubenswrapper[4753]: I1005 21:01:04.726110 4753 generic.go:334] "Generic (PLEG): container finished" podID="383607d3-fca4-477a-a189-c6aab8192496" containerID="7af12edf74b3677fbdbd56f408b57db9b0ecb342d928911d1b42ae2e202a8028" exitCode=0 Oct 05 21:01:04 crc kubenswrapper[4753]: I1005 21:01:04.726200 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-cron-29328301-rlwxs" event={"ID":"383607d3-fca4-477a-a189-c6aab8192496","Type":"ContainerDied","Data":"7af12edf74b3677fbdbd56f408b57db9b0ecb342d928911d1b42ae2e202a8028"} Oct 05 21:01:04 crc kubenswrapper[4753]: I1005 21:01:04.731756 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="f370bebddf0b48b3eae16a7acc2526579ae4e4e93d5827de3123477139f20060" exitCode=0 Oct 05 21:01:04 crc kubenswrapper[4753]: I1005 21:01:04.731858 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"f370bebddf0b48b3eae16a7acc2526579ae4e4e93d5827de3123477139f20060"} Oct 05 21:01:04 crc kubenswrapper[4753]: I1005 21:01:04.731917 4753 scope.go:117] "RemoveContainer" containerID="eae56e1e9c379b643eeead89aa7cb23b6122e9199dab69adc4b77abaa6639b41" Oct 05 21:01:04 crc kubenswrapper[4753]: I1005 21:01:04.753045 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-58jh5" podStartSLOduration=4.292917152 podStartE2EDuration="6.753028143s" podCreationTimestamp="2025-10-05 21:00:58 +0000 UTC" firstStartedPulling="2025-10-05 21:01:00.679437737 +0000 UTC m=+2769.527765979" lastFinishedPulling="2025-10-05 21:01:03.139548728 +0000 UTC m=+2771.987876970" observedRunningTime="2025-10-05 21:01:03.739268362 +0000 UTC m=+2772.587596604" watchObservedRunningTime="2025-10-05 21:01:04.753028143 +0000 UTC m=+2773.601356375" Oct 05 21:01:05 crc kubenswrapper[4753]: I1005 21:01:05.742066 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a"} Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.067527 
4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.252026 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-config-data\") pod \"383607d3-fca4-477a-a189-c6aab8192496\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.252290 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-fernet-keys\") pod \"383607d3-fca4-477a-a189-c6aab8192496\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.252506 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bvcg\" (UniqueName: \"kubernetes.io/projected/383607d3-fca4-477a-a189-c6aab8192496-kube-api-access-2bvcg\") pod \"383607d3-fca4-477a-a189-c6aab8192496\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.252570 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-combined-ca-bundle\") pod \"383607d3-fca4-477a-a189-c6aab8192496\" (UID: \"383607d3-fca4-477a-a189-c6aab8192496\") " Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.259894 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383607d3-fca4-477a-a189-c6aab8192496-kube-api-access-2bvcg" (OuterVolumeSpecName: "kube-api-access-2bvcg") pod "383607d3-fca4-477a-a189-c6aab8192496" (UID: "383607d3-fca4-477a-a189-c6aab8192496"). InnerVolumeSpecName "kube-api-access-2bvcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.260215 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "383607d3-fca4-477a-a189-c6aab8192496" (UID: "383607d3-fca4-477a-a189-c6aab8192496"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.279530 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "383607d3-fca4-477a-a189-c6aab8192496" (UID: "383607d3-fca4-477a-a189-c6aab8192496"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.338200 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-config-data" (OuterVolumeSpecName: "config-data") pod "383607d3-fca4-477a-a189-c6aab8192496" (UID: "383607d3-fca4-477a-a189-c6aab8192496"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.354493 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.354751 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.354844 4753 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/383607d3-fca4-477a-a189-c6aab8192496-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.354938 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bvcg\" (UniqueName: \"kubernetes.io/projected/383607d3-fca4-477a-a189-c6aab8192496-kube-api-access-2bvcg\") on node \"crc\" DevicePath \"\"" Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.751127 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29328301-rlwxs" event={"ID":"383607d3-fca4-477a-a189-c6aab8192496","Type":"ContainerDied","Data":"ec41d41d5e593bbee23bae88247a3d417e2b9902c7d11915a04e1fe451577f23"} Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.751415 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec41d41d5e593bbee23bae88247a3d417e2b9902c7d11915a04e1fe451577f23" Oct 05 21:01:06 crc kubenswrapper[4753]: I1005 21:01:06.751177 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29328301-rlwxs" Oct 05 21:01:09 crc kubenswrapper[4753]: I1005 21:01:09.305732 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:01:09 crc kubenswrapper[4753]: I1005 21:01:09.306159 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:01:09 crc kubenswrapper[4753]: I1005 21:01:09.357634 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:01:09 crc kubenswrapper[4753]: I1005 21:01:09.836376 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:01:09 crc kubenswrapper[4753]: I1005 21:01:09.893513 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-58jh5"] Oct 05 21:01:11 crc kubenswrapper[4753]: I1005 21:01:11.792538 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-58jh5" podUID="455ac15e-ed1d-4399-85fa-9980d811f818" containerName="registry-server" containerID="cri-o://408420c3cac714c87fcb978f5c354786d5a47bb81e9995e9400b90e15c67f66f" gracePeriod=2 Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.271549 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.461433 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/455ac15e-ed1d-4399-85fa-9980d811f818-utilities\") pod \"455ac15e-ed1d-4399-85fa-9980d811f818\" (UID: \"455ac15e-ed1d-4399-85fa-9980d811f818\") " Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.461608 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/455ac15e-ed1d-4399-85fa-9980d811f818-catalog-content\") pod \"455ac15e-ed1d-4399-85fa-9980d811f818\" (UID: \"455ac15e-ed1d-4399-85fa-9980d811f818\") " Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.461765 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh6ph\" (UniqueName: \"kubernetes.io/projected/455ac15e-ed1d-4399-85fa-9980d811f818-kube-api-access-nh6ph\") pod \"455ac15e-ed1d-4399-85fa-9980d811f818\" (UID: \"455ac15e-ed1d-4399-85fa-9980d811f818\") " Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.464637 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/455ac15e-ed1d-4399-85fa-9980d811f818-utilities" (OuterVolumeSpecName: "utilities") pod "455ac15e-ed1d-4399-85fa-9980d811f818" (UID: "455ac15e-ed1d-4399-85fa-9980d811f818"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.479399 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/455ac15e-ed1d-4399-85fa-9980d811f818-kube-api-access-nh6ph" (OuterVolumeSpecName: "kube-api-access-nh6ph") pod "455ac15e-ed1d-4399-85fa-9980d811f818" (UID: "455ac15e-ed1d-4399-85fa-9980d811f818"). InnerVolumeSpecName "kube-api-access-nh6ph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.521882 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/455ac15e-ed1d-4399-85fa-9980d811f818-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "455ac15e-ed1d-4399-85fa-9980d811f818" (UID: "455ac15e-ed1d-4399-85fa-9980d811f818"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.564283 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/455ac15e-ed1d-4399-85fa-9980d811f818-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.564332 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh6ph\" (UniqueName: \"kubernetes.io/projected/455ac15e-ed1d-4399-85fa-9980d811f818-kube-api-access-nh6ph\") on node \"crc\" DevicePath \"\"" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.564344 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/455ac15e-ed1d-4399-85fa-9980d811f818-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.804001 4753 generic.go:334] "Generic (PLEG): container finished" podID="455ac15e-ed1d-4399-85fa-9980d811f818" containerID="408420c3cac714c87fcb978f5c354786d5a47bb81e9995e9400b90e15c67f66f" exitCode=0 Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.804044 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58jh5" event={"ID":"455ac15e-ed1d-4399-85fa-9980d811f818","Type":"ContainerDied","Data":"408420c3cac714c87fcb978f5c354786d5a47bb81e9995e9400b90e15c67f66f"} Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.804069 4753 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-58jh5" event={"ID":"455ac15e-ed1d-4399-85fa-9980d811f818","Type":"ContainerDied","Data":"0ec813d2e87a87c506778101f05560384e6f477a5161ef7890134d96e39332f2"} Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.804085 4753 scope.go:117] "RemoveContainer" containerID="408420c3cac714c87fcb978f5c354786d5a47bb81e9995e9400b90e15c67f66f" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.804237 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-58jh5" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.840155 4753 scope.go:117] "RemoveContainer" containerID="f8b55ec01735e5965fdc8d5f9d1dcb53172a7a61d8e88d97e667c2591b29fbe8" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.842066 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-58jh5"] Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.854790 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-58jh5"] Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.865129 4753 scope.go:117] "RemoveContainer" containerID="77a041286002bc9b22617f67a704b49ad5ec1baa6587372eafbe1c898d498957" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.906761 4753 scope.go:117] "RemoveContainer" containerID="408420c3cac714c87fcb978f5c354786d5a47bb81e9995e9400b90e15c67f66f" Oct 05 21:01:12 crc kubenswrapper[4753]: E1005 21:01:12.907646 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408420c3cac714c87fcb978f5c354786d5a47bb81e9995e9400b90e15c67f66f\": container with ID starting with 408420c3cac714c87fcb978f5c354786d5a47bb81e9995e9400b90e15c67f66f not found: ID does not exist" containerID="408420c3cac714c87fcb978f5c354786d5a47bb81e9995e9400b90e15c67f66f" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 
21:01:12.907685 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408420c3cac714c87fcb978f5c354786d5a47bb81e9995e9400b90e15c67f66f"} err="failed to get container status \"408420c3cac714c87fcb978f5c354786d5a47bb81e9995e9400b90e15c67f66f\": rpc error: code = NotFound desc = could not find container \"408420c3cac714c87fcb978f5c354786d5a47bb81e9995e9400b90e15c67f66f\": container with ID starting with 408420c3cac714c87fcb978f5c354786d5a47bb81e9995e9400b90e15c67f66f not found: ID does not exist" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.907721 4753 scope.go:117] "RemoveContainer" containerID="f8b55ec01735e5965fdc8d5f9d1dcb53172a7a61d8e88d97e667c2591b29fbe8" Oct 05 21:01:12 crc kubenswrapper[4753]: E1005 21:01:12.907993 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8b55ec01735e5965fdc8d5f9d1dcb53172a7a61d8e88d97e667c2591b29fbe8\": container with ID starting with f8b55ec01735e5965fdc8d5f9d1dcb53172a7a61d8e88d97e667c2591b29fbe8 not found: ID does not exist" containerID="f8b55ec01735e5965fdc8d5f9d1dcb53172a7a61d8e88d97e667c2591b29fbe8" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.908034 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b55ec01735e5965fdc8d5f9d1dcb53172a7a61d8e88d97e667c2591b29fbe8"} err="failed to get container status \"f8b55ec01735e5965fdc8d5f9d1dcb53172a7a61d8e88d97e667c2591b29fbe8\": rpc error: code = NotFound desc = could not find container \"f8b55ec01735e5965fdc8d5f9d1dcb53172a7a61d8e88d97e667c2591b29fbe8\": container with ID starting with f8b55ec01735e5965fdc8d5f9d1dcb53172a7a61d8e88d97e667c2591b29fbe8 not found: ID does not exist" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.908063 4753 scope.go:117] "RemoveContainer" containerID="77a041286002bc9b22617f67a704b49ad5ec1baa6587372eafbe1c898d498957" Oct 05 21:01:12 crc 
kubenswrapper[4753]: E1005 21:01:12.908518 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a041286002bc9b22617f67a704b49ad5ec1baa6587372eafbe1c898d498957\": container with ID starting with 77a041286002bc9b22617f67a704b49ad5ec1baa6587372eafbe1c898d498957 not found: ID does not exist" containerID="77a041286002bc9b22617f67a704b49ad5ec1baa6587372eafbe1c898d498957" Oct 05 21:01:12 crc kubenswrapper[4753]: I1005 21:01:12.908543 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a041286002bc9b22617f67a704b49ad5ec1baa6587372eafbe1c898d498957"} err="failed to get container status \"77a041286002bc9b22617f67a704b49ad5ec1baa6587372eafbe1c898d498957\": rpc error: code = NotFound desc = could not find container \"77a041286002bc9b22617f67a704b49ad5ec1baa6587372eafbe1c898d498957\": container with ID starting with 77a041286002bc9b22617f67a704b49ad5ec1baa6587372eafbe1c898d498957 not found: ID does not exist" Oct 05 21:01:13 crc kubenswrapper[4753]: I1005 21:01:13.867602 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="455ac15e-ed1d-4399-85fa-9980d811f818" path="/var/lib/kubelet/pods/455ac15e-ed1d-4399-85fa-9980d811f818/volumes" Oct 05 21:03:04 crc kubenswrapper[4753]: I1005 21:03:04.489931 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:03:04 crc kubenswrapper[4753]: I1005 21:03:04.490641 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 05 21:03:34 crc kubenswrapper[4753]: I1005 21:03:34.490792 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:03:34 crc kubenswrapper[4753]: I1005 21:03:34.491686 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:04:04 crc kubenswrapper[4753]: I1005 21:04:04.490702 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:04:04 crc kubenswrapper[4753]: I1005 21:04:04.491240 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:04:04 crc kubenswrapper[4753]: I1005 21:04:04.491289 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 21:04:04 crc kubenswrapper[4753]: I1005 21:04:04.491973 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 21:04:04 crc kubenswrapper[4753]: I1005 21:04:04.492017 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" containerID="cri-o://872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" gracePeriod=600 Oct 05 21:04:04 crc kubenswrapper[4753]: E1005 21:04:04.620408 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:04:05 crc kubenswrapper[4753]: I1005 21:04:05.367059 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" exitCode=0 Oct 05 21:04:05 crc kubenswrapper[4753]: I1005 21:04:05.367145 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a"} Oct 05 21:04:05 crc kubenswrapper[4753]: I1005 21:04:05.367469 4753 scope.go:117] "RemoveContainer" containerID="f370bebddf0b48b3eae16a7acc2526579ae4e4e93d5827de3123477139f20060" Oct 05 21:04:05 crc kubenswrapper[4753]: I1005 21:04:05.368732 4753 
scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:04:05 crc kubenswrapper[4753]: E1005 21:04:05.369196 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:04:15 crc kubenswrapper[4753]: I1005 21:04:15.852231 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:04:15 crc kubenswrapper[4753]: E1005 21:04:15.854465 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:04:29 crc kubenswrapper[4753]: I1005 21:04:29.853005 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:04:29 crc kubenswrapper[4753]: E1005 21:04:29.853773 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:04:42 crc kubenswrapper[4753]: I1005 
21:04:42.851949 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:04:42 crc kubenswrapper[4753]: E1005 21:04:42.852990 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:04:55 crc kubenswrapper[4753]: I1005 21:04:55.852704 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:04:55 crc kubenswrapper[4753]: E1005 21:04:55.853455 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:05:06 crc kubenswrapper[4753]: I1005 21:05:06.852793 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:05:06 crc kubenswrapper[4753]: E1005 21:05:06.853507 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:05:09 crc 
kubenswrapper[4753]: I1005 21:05:09.905200 4753 generic.go:334] "Generic (PLEG): container finished" podID="9f393cda-bc70-44d4-a534-a72b71dcf0b7" containerID="e0f58d99465c2d2291b066e8402a2af224474499d2dacec991eed7e7fe2b4f93" exitCode=0 Oct 05 21:05:09 crc kubenswrapper[4753]: I1005 21:05:09.905297 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" event={"ID":"9f393cda-bc70-44d4-a534-a72b71dcf0b7","Type":"ContainerDied","Data":"e0f58d99465c2d2291b066e8402a2af224474499d2dacec991eed7e7fe2b4f93"} Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.412486 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.610328 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knqv5\" (UniqueName: \"kubernetes.io/projected/9f393cda-bc70-44d4-a534-a72b71dcf0b7-kube-api-access-knqv5\") pod \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.610459 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-inventory\") pod \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.610499 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-libvirt-combined-ca-bundle\") pod \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.610521 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-libvirt-secret-0\") pod \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.610555 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-ssh-key\") pod \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.610589 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-ceph\") pod \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\" (UID: \"9f393cda-bc70-44d4-a534-a72b71dcf0b7\") " Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.615594 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-ceph" (OuterVolumeSpecName: "ceph") pod "9f393cda-bc70-44d4-a534-a72b71dcf0b7" (UID: "9f393cda-bc70-44d4-a534-a72b71dcf0b7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.616042 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f393cda-bc70-44d4-a534-a72b71dcf0b7-kube-api-access-knqv5" (OuterVolumeSpecName: "kube-api-access-knqv5") pod "9f393cda-bc70-44d4-a534-a72b71dcf0b7" (UID: "9f393cda-bc70-44d4-a534-a72b71dcf0b7"). InnerVolumeSpecName "kube-api-access-knqv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.617616 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9f393cda-bc70-44d4-a534-a72b71dcf0b7" (UID: "9f393cda-bc70-44d4-a534-a72b71dcf0b7"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.645637 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9f393cda-bc70-44d4-a534-a72b71dcf0b7" (UID: "9f393cda-bc70-44d4-a534-a72b71dcf0b7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.648156 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-inventory" (OuterVolumeSpecName: "inventory") pod "9f393cda-bc70-44d4-a534-a72b71dcf0b7" (UID: "9f393cda-bc70-44d4-a534-a72b71dcf0b7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.655417 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "9f393cda-bc70-44d4-a534-a72b71dcf0b7" (UID: "9f393cda-bc70-44d4-a534-a72b71dcf0b7"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.712957 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.712998 4753 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.713012 4753 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.713223 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.713232 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9f393cda-bc70-44d4-a534-a72b71dcf0b7-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.713240 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knqv5\" (UniqueName: \"kubernetes.io/projected/9f393cda-bc70-44d4-a534-a72b71dcf0b7-kube-api-access-knqv5\") on node \"crc\" DevicePath \"\"" Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.930962 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" event={"ID":"9f393cda-bc70-44d4-a534-a72b71dcf0b7","Type":"ContainerDied","Data":"880ba9eaf6af9e5780df90ed80d6c53430b03efb3868cdeeb2a443e0f3ee7445"} Oct 05 21:05:11 crc 
kubenswrapper[4753]: I1005 21:05:11.931342 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="880ba9eaf6af9e5780df90ed80d6c53430b03efb3868cdeeb2a443e0f3ee7445" Oct 05 21:05:11 crc kubenswrapper[4753]: I1005 21:05:11.930997 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrg62" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.015326 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk"] Oct 05 21:05:12 crc kubenswrapper[4753]: E1005 21:05:12.015884 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383607d3-fca4-477a-a189-c6aab8192496" containerName="keystone-cron" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.015973 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="383607d3-fca4-477a-a189-c6aab8192496" containerName="keystone-cron" Oct 05 21:05:12 crc kubenswrapper[4753]: E1005 21:05:12.016051 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f393cda-bc70-44d4-a534-a72b71dcf0b7" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.016111 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f393cda-bc70-44d4-a534-a72b71dcf0b7" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 05 21:05:12 crc kubenswrapper[4753]: E1005 21:05:12.016205 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455ac15e-ed1d-4399-85fa-9980d811f818" containerName="extract-utilities" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.016259 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="455ac15e-ed1d-4399-85fa-9980d811f818" containerName="extract-utilities" Oct 05 21:05:12 crc kubenswrapper[4753]: E1005 21:05:12.016329 4753 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="455ac15e-ed1d-4399-85fa-9980d811f818" containerName="registry-server" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.016422 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="455ac15e-ed1d-4399-85fa-9980d811f818" containerName="registry-server" Oct 05 21:05:12 crc kubenswrapper[4753]: E1005 21:05:12.016499 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455ac15e-ed1d-4399-85fa-9980d811f818" containerName="extract-content" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.016553 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="455ac15e-ed1d-4399-85fa-9980d811f818" containerName="extract-content" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.016766 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="455ac15e-ed1d-4399-85fa-9980d811f818" containerName="registry-server" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.016861 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f393cda-bc70-44d4-a534-a72b71dcf0b7" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.016936 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="383607d3-fca4-477a-a189-c6aab8192496" containerName="keystone-cron" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.017653 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.021714 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.021806 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.022334 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vfbvc" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.023009 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.023224 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.023612 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.024282 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.024509 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.024676 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.034166 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk"] Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.120472 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.120554 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.120608 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm62b\" (UniqueName: \"kubernetes.io/projected/db911cf0-3e57-45a3-a1ce-06f5260745b4-kube-api-access-dm62b\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.120678 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.120714 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: 
\"kubernetes.io/configmap/db911cf0-3e57-45a3-a1ce-06f5260745b4-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.120753 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.120878 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.120926 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.120977 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.121179 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.121237 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.221713 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.221755 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: 
\"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.221796 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.221818 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.221841 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm62b\" (UniqueName: \"kubernetes.io/projected/db911cf0-3e57-45a3-a1ce-06f5260745b4-kube-api-access-dm62b\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.221870 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 
21:05:12.221888 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/db911cf0-3e57-45a3-a1ce-06f5260745b4-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.221913 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.221940 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.221957 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.221976 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.223442 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.225500 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.226362 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.227320 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.229601 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.229942 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/db911cf0-3e57-45a3-a1ce-06f5260745b4-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.230155 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.230715 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.230988 4753 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.231392 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.242566 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm62b\" (UniqueName: \"kubernetes.io/projected/db911cf0-3e57-45a3-a1ce-06f5260745b4-kube-api-access-dm62b\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.337865 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.903282 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk"] Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.913199 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 05 21:05:12 crc kubenswrapper[4753]: I1005 21:05:12.944438 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" event={"ID":"db911cf0-3e57-45a3-a1ce-06f5260745b4","Type":"ContainerStarted","Data":"05a1e3a31b81d34110f35d9bbf335347770be88b2a3681c6eca860acb4174ad9"} Oct 05 21:05:13 crc kubenswrapper[4753]: I1005 21:05:13.956370 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" event={"ID":"db911cf0-3e57-45a3-a1ce-06f5260745b4","Type":"ContainerStarted","Data":"f0c5e0c9f9d3e32e53868f123671bac711deb814384788d4c81bd7ed4412bf04"} Oct 05 21:05:21 crc kubenswrapper[4753]: I1005 21:05:21.860208 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:05:21 crc kubenswrapper[4753]: E1005 21:05:21.860945 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:05:36 crc kubenswrapper[4753]: I1005 21:05:36.852367 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" 
Oct 05 21:05:36 crc kubenswrapper[4753]: E1005 21:05:36.853290 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:05:47 crc kubenswrapper[4753]: I1005 21:05:47.851819 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:05:47 crc kubenswrapper[4753]: E1005 21:05:47.852470 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:06:02 crc kubenswrapper[4753]: I1005 21:06:02.852917 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:06:02 crc kubenswrapper[4753]: E1005 21:06:02.854380 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:06:13 crc kubenswrapper[4753]: I1005 21:06:13.851806 4753 scope.go:117] "RemoveContainer" 
containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:06:13 crc kubenswrapper[4753]: E1005 21:06:13.853474 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:06:28 crc kubenswrapper[4753]: I1005 21:06:28.862050 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:06:28 crc kubenswrapper[4753]: E1005 21:06:28.863402 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:06:40 crc kubenswrapper[4753]: I1005 21:06:40.852312 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:06:40 crc kubenswrapper[4753]: E1005 21:06:40.853110 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:06:52 crc kubenswrapper[4753]: I1005 21:06:52.852003 4753 scope.go:117] 
"RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:06:52 crc kubenswrapper[4753]: E1005 21:06:52.852696 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:07:05 crc kubenswrapper[4753]: I1005 21:07:05.852290 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:07:05 crc kubenswrapper[4753]: E1005 21:07:05.853200 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:07:17 crc kubenswrapper[4753]: I1005 21:07:17.852747 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:07:17 crc kubenswrapper[4753]: E1005 21:07:17.853654 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:07:31 crc kubenswrapper[4753]: I1005 21:07:31.862691 
4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:07:31 crc kubenswrapper[4753]: E1005 21:07:31.864556 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:07:42 crc kubenswrapper[4753]: I1005 21:07:42.852379 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:07:42 crc kubenswrapper[4753]: E1005 21:07:42.853159 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:07:53 crc kubenswrapper[4753]: I1005 21:07:53.852866 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:07:53 crc kubenswrapper[4753]: E1005 21:07:53.853636 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:08:06 crc kubenswrapper[4753]: I1005 
21:08:06.851798 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:08:06 crc kubenswrapper[4753]: E1005 21:08:06.853556 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:08:18 crc kubenswrapper[4753]: I1005 21:08:18.852572 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:08:18 crc kubenswrapper[4753]: E1005 21:08:18.853403 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:08:33 crc kubenswrapper[4753]: I1005 21:08:33.852282 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:08:33 crc kubenswrapper[4753]: E1005 21:08:33.852893 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:08:43 crc 
kubenswrapper[4753]: I1005 21:08:43.369122 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" podStartSLOduration=211.873013445 podStartE2EDuration="3m32.36910528s" podCreationTimestamp="2025-10-05 21:05:11 +0000 UTC" firstStartedPulling="2025-10-05 21:05:12.91295068 +0000 UTC m=+3021.761278912" lastFinishedPulling="2025-10-05 21:05:13.409042515 +0000 UTC m=+3022.257370747" observedRunningTime="2025-10-05 21:05:13.993611863 +0000 UTC m=+3022.841940105" watchObservedRunningTime="2025-10-05 21:08:43.36910528 +0000 UTC m=+3232.217433512" Oct 05 21:08:43 crc kubenswrapper[4753]: I1005 21:08:43.374499 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wxzvw"] Oct 05 21:08:43 crc kubenswrapper[4753]: I1005 21:08:43.376320 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 21:08:43 crc kubenswrapper[4753]: I1005 21:08:43.392931 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wxzvw"] Oct 05 21:08:43 crc kubenswrapper[4753]: I1005 21:08:43.534426 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e13df3-1c57-46bb-98be-9628284ae7e6-utilities\") pod \"certified-operators-wxzvw\" (UID: \"a5e13df3-1c57-46bb-98be-9628284ae7e6\") " pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 21:08:43 crc kubenswrapper[4753]: I1005 21:08:43.534798 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e13df3-1c57-46bb-98be-9628284ae7e6-catalog-content\") pod \"certified-operators-wxzvw\" (UID: \"a5e13df3-1c57-46bb-98be-9628284ae7e6\") " pod="openshift-marketplace/certified-operators-wxzvw" Oct 
05 21:08:43 crc kubenswrapper[4753]: I1005 21:08:43.534883 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9txb6\" (UniqueName: \"kubernetes.io/projected/a5e13df3-1c57-46bb-98be-9628284ae7e6-kube-api-access-9txb6\") pod \"certified-operators-wxzvw\" (UID: \"a5e13df3-1c57-46bb-98be-9628284ae7e6\") " pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 21:08:43 crc kubenswrapper[4753]: I1005 21:08:43.636357 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e13df3-1c57-46bb-98be-9628284ae7e6-utilities\") pod \"certified-operators-wxzvw\" (UID: \"a5e13df3-1c57-46bb-98be-9628284ae7e6\") " pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 21:08:43 crc kubenswrapper[4753]: I1005 21:08:43.636436 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e13df3-1c57-46bb-98be-9628284ae7e6-catalog-content\") pod \"certified-operators-wxzvw\" (UID: \"a5e13df3-1c57-46bb-98be-9628284ae7e6\") " pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 21:08:43 crc kubenswrapper[4753]: I1005 21:08:43.636507 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9txb6\" (UniqueName: \"kubernetes.io/projected/a5e13df3-1c57-46bb-98be-9628284ae7e6-kube-api-access-9txb6\") pod \"certified-operators-wxzvw\" (UID: \"a5e13df3-1c57-46bb-98be-9628284ae7e6\") " pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 21:08:43 crc kubenswrapper[4753]: I1005 21:08:43.637124 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e13df3-1c57-46bb-98be-9628284ae7e6-utilities\") pod \"certified-operators-wxzvw\" (UID: \"a5e13df3-1c57-46bb-98be-9628284ae7e6\") " pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 
21:08:43 crc kubenswrapper[4753]: I1005 21:08:43.637272 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e13df3-1c57-46bb-98be-9628284ae7e6-catalog-content\") pod \"certified-operators-wxzvw\" (UID: \"a5e13df3-1c57-46bb-98be-9628284ae7e6\") " pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 21:08:43 crc kubenswrapper[4753]: I1005 21:08:43.657022 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9txb6\" (UniqueName: \"kubernetes.io/projected/a5e13df3-1c57-46bb-98be-9628284ae7e6-kube-api-access-9txb6\") pod \"certified-operators-wxzvw\" (UID: \"a5e13df3-1c57-46bb-98be-9628284ae7e6\") " pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 21:08:43 crc kubenswrapper[4753]: I1005 21:08:43.743552 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 21:08:44 crc kubenswrapper[4753]: I1005 21:08:44.289707 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wxzvw"] Oct 05 21:08:44 crc kubenswrapper[4753]: I1005 21:08:44.877954 4753 generic.go:334] "Generic (PLEG): container finished" podID="a5e13df3-1c57-46bb-98be-9628284ae7e6" containerID="c3b9cf8728e74e4385303e229c429af475e6887e5a127aa3183e9369cfa5e644" exitCode=0 Oct 05 21:08:44 crc kubenswrapper[4753]: I1005 21:08:44.878224 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxzvw" event={"ID":"a5e13df3-1c57-46bb-98be-9628284ae7e6","Type":"ContainerDied","Data":"c3b9cf8728e74e4385303e229c429af475e6887e5a127aa3183e9369cfa5e644"} Oct 05 21:08:44 crc kubenswrapper[4753]: I1005 21:08:44.878259 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxzvw" 
event={"ID":"a5e13df3-1c57-46bb-98be-9628284ae7e6","Type":"ContainerStarted","Data":"e1f4ec0a4406afa8a31f83b3029dbbd4102e719e6d714346cfbf73b334372942"} Oct 05 21:08:45 crc kubenswrapper[4753]: I1005 21:08:45.888580 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxzvw" event={"ID":"a5e13df3-1c57-46bb-98be-9628284ae7e6","Type":"ContainerStarted","Data":"7205fdacb925cdaeaf9c91c760626a242ae1d8b4ba56a57a7568ea0520cfff3d"} Oct 05 21:08:46 crc kubenswrapper[4753]: I1005 21:08:46.852166 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:08:46 crc kubenswrapper[4753]: E1005 21:08:46.852468 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:08:46 crc kubenswrapper[4753]: I1005 21:08:46.905969 4753 generic.go:334] "Generic (PLEG): container finished" podID="a5e13df3-1c57-46bb-98be-9628284ae7e6" containerID="7205fdacb925cdaeaf9c91c760626a242ae1d8b4ba56a57a7568ea0520cfff3d" exitCode=0 Oct 05 21:08:46 crc kubenswrapper[4753]: I1005 21:08:46.906004 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxzvw" event={"ID":"a5e13df3-1c57-46bb-98be-9628284ae7e6","Type":"ContainerDied","Data":"7205fdacb925cdaeaf9c91c760626a242ae1d8b4ba56a57a7568ea0520cfff3d"} Oct 05 21:08:47 crc kubenswrapper[4753]: I1005 21:08:47.916297 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxzvw" 
event={"ID":"a5e13df3-1c57-46bb-98be-9628284ae7e6","Type":"ContainerStarted","Data":"9be915d92aef4f135f97a1e80c4d4b2f3edaa972181d9e3fa71782529caf1f4f"} Oct 05 21:08:47 crc kubenswrapper[4753]: I1005 21:08:47.938741 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wxzvw" podStartSLOduration=2.501443304 podStartE2EDuration="4.938722375s" podCreationTimestamp="2025-10-05 21:08:43 +0000 UTC" firstStartedPulling="2025-10-05 21:08:44.880460769 +0000 UTC m=+3233.728789001" lastFinishedPulling="2025-10-05 21:08:47.31773984 +0000 UTC m=+3236.166068072" observedRunningTime="2025-10-05 21:08:47.934106773 +0000 UTC m=+3236.782435005" watchObservedRunningTime="2025-10-05 21:08:47.938722375 +0000 UTC m=+3236.787050607" Oct 05 21:08:53 crc kubenswrapper[4753]: I1005 21:08:53.744679 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 21:08:53 crc kubenswrapper[4753]: I1005 21:08:53.745256 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 21:08:53 crc kubenswrapper[4753]: I1005 21:08:53.810435 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 21:08:54 crc kubenswrapper[4753]: I1005 21:08:54.014852 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 21:08:54 crc kubenswrapper[4753]: I1005 21:08:54.059798 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wxzvw"] Oct 05 21:08:55 crc kubenswrapper[4753]: I1005 21:08:55.990454 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wxzvw" podUID="a5e13df3-1c57-46bb-98be-9628284ae7e6" containerName="registry-server" 
containerID="cri-o://9be915d92aef4f135f97a1e80c4d4b2f3edaa972181d9e3fa71782529caf1f4f" gracePeriod=2 Oct 05 21:08:56 crc kubenswrapper[4753]: I1005 21:08:56.424995 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 21:08:56 crc kubenswrapper[4753]: I1005 21:08:56.583912 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e13df3-1c57-46bb-98be-9628284ae7e6-utilities\") pod \"a5e13df3-1c57-46bb-98be-9628284ae7e6\" (UID: \"a5e13df3-1c57-46bb-98be-9628284ae7e6\") " Oct 05 21:08:56 crc kubenswrapper[4753]: I1005 21:08:56.583956 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9txb6\" (UniqueName: \"kubernetes.io/projected/a5e13df3-1c57-46bb-98be-9628284ae7e6-kube-api-access-9txb6\") pod \"a5e13df3-1c57-46bb-98be-9628284ae7e6\" (UID: \"a5e13df3-1c57-46bb-98be-9628284ae7e6\") " Oct 05 21:08:56 crc kubenswrapper[4753]: I1005 21:08:56.584086 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e13df3-1c57-46bb-98be-9628284ae7e6-catalog-content\") pod \"a5e13df3-1c57-46bb-98be-9628284ae7e6\" (UID: \"a5e13df3-1c57-46bb-98be-9628284ae7e6\") " Oct 05 21:08:56 crc kubenswrapper[4753]: I1005 21:08:56.587355 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e13df3-1c57-46bb-98be-9628284ae7e6-utilities" (OuterVolumeSpecName: "utilities") pod "a5e13df3-1c57-46bb-98be-9628284ae7e6" (UID: "a5e13df3-1c57-46bb-98be-9628284ae7e6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:08:56 crc kubenswrapper[4753]: I1005 21:08:56.591295 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e13df3-1c57-46bb-98be-9628284ae7e6-kube-api-access-9txb6" (OuterVolumeSpecName: "kube-api-access-9txb6") pod "a5e13df3-1c57-46bb-98be-9628284ae7e6" (UID: "a5e13df3-1c57-46bb-98be-9628284ae7e6"). InnerVolumeSpecName "kube-api-access-9txb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:08:56 crc kubenswrapper[4753]: I1005 21:08:56.629624 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e13df3-1c57-46bb-98be-9628284ae7e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5e13df3-1c57-46bb-98be-9628284ae7e6" (UID: "a5e13df3-1c57-46bb-98be-9628284ae7e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:08:56 crc kubenswrapper[4753]: I1005 21:08:56.686464 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e13df3-1c57-46bb-98be-9628284ae7e6-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 21:08:56 crc kubenswrapper[4753]: I1005 21:08:56.686492 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9txb6\" (UniqueName: \"kubernetes.io/projected/a5e13df3-1c57-46bb-98be-9628284ae7e6-kube-api-access-9txb6\") on node \"crc\" DevicePath \"\"" Oct 05 21:08:56 crc kubenswrapper[4753]: I1005 21:08:56.686502 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e13df3-1c57-46bb-98be-9628284ae7e6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 21:08:56 crc kubenswrapper[4753]: I1005 21:08:56.999626 4753 generic.go:334] "Generic (PLEG): container finished" podID="a5e13df3-1c57-46bb-98be-9628284ae7e6" 
containerID="9be915d92aef4f135f97a1e80c4d4b2f3edaa972181d9e3fa71782529caf1f4f" exitCode=0 Oct 05 21:08:56 crc kubenswrapper[4753]: I1005 21:08:56.999669 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxzvw" event={"ID":"a5e13df3-1c57-46bb-98be-9628284ae7e6","Type":"ContainerDied","Data":"9be915d92aef4f135f97a1e80c4d4b2f3edaa972181d9e3fa71782529caf1f4f"} Oct 05 21:08:57 crc kubenswrapper[4753]: I1005 21:08:56.999697 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxzvw" event={"ID":"a5e13df3-1c57-46bb-98be-9628284ae7e6","Type":"ContainerDied","Data":"e1f4ec0a4406afa8a31f83b3029dbbd4102e719e6d714346cfbf73b334372942"} Oct 05 21:08:57 crc kubenswrapper[4753]: I1005 21:08:56.999699 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxzvw" Oct 05 21:08:57 crc kubenswrapper[4753]: I1005 21:08:56.999714 4753 scope.go:117] "RemoveContainer" containerID="9be915d92aef4f135f97a1e80c4d4b2f3edaa972181d9e3fa71782529caf1f4f" Oct 05 21:08:57 crc kubenswrapper[4753]: I1005 21:08:57.021712 4753 scope.go:117] "RemoveContainer" containerID="7205fdacb925cdaeaf9c91c760626a242ae1d8b4ba56a57a7568ea0520cfff3d" Oct 05 21:08:57 crc kubenswrapper[4753]: I1005 21:08:57.032068 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wxzvw"] Oct 05 21:08:57 crc kubenswrapper[4753]: I1005 21:08:57.046255 4753 scope.go:117] "RemoveContainer" containerID="c3b9cf8728e74e4385303e229c429af475e6887e5a127aa3183e9369cfa5e644" Oct 05 21:08:57 crc kubenswrapper[4753]: I1005 21:08:57.053216 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wxzvw"] Oct 05 21:08:57 crc kubenswrapper[4753]: I1005 21:08:57.085496 4753 scope.go:117] "RemoveContainer" containerID="9be915d92aef4f135f97a1e80c4d4b2f3edaa972181d9e3fa71782529caf1f4f" Oct 05 
21:08:57 crc kubenswrapper[4753]: E1005 21:08:57.086174 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9be915d92aef4f135f97a1e80c4d4b2f3edaa972181d9e3fa71782529caf1f4f\": container with ID starting with 9be915d92aef4f135f97a1e80c4d4b2f3edaa972181d9e3fa71782529caf1f4f not found: ID does not exist" containerID="9be915d92aef4f135f97a1e80c4d4b2f3edaa972181d9e3fa71782529caf1f4f" Oct 05 21:08:57 crc kubenswrapper[4753]: I1005 21:08:57.086210 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be915d92aef4f135f97a1e80c4d4b2f3edaa972181d9e3fa71782529caf1f4f"} err="failed to get container status \"9be915d92aef4f135f97a1e80c4d4b2f3edaa972181d9e3fa71782529caf1f4f\": rpc error: code = NotFound desc = could not find container \"9be915d92aef4f135f97a1e80c4d4b2f3edaa972181d9e3fa71782529caf1f4f\": container with ID starting with 9be915d92aef4f135f97a1e80c4d4b2f3edaa972181d9e3fa71782529caf1f4f not found: ID does not exist" Oct 05 21:08:57 crc kubenswrapper[4753]: I1005 21:08:57.086238 4753 scope.go:117] "RemoveContainer" containerID="7205fdacb925cdaeaf9c91c760626a242ae1d8b4ba56a57a7568ea0520cfff3d" Oct 05 21:08:57 crc kubenswrapper[4753]: E1005 21:08:57.086588 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7205fdacb925cdaeaf9c91c760626a242ae1d8b4ba56a57a7568ea0520cfff3d\": container with ID starting with 7205fdacb925cdaeaf9c91c760626a242ae1d8b4ba56a57a7568ea0520cfff3d not found: ID does not exist" containerID="7205fdacb925cdaeaf9c91c760626a242ae1d8b4ba56a57a7568ea0520cfff3d" Oct 05 21:08:57 crc kubenswrapper[4753]: I1005 21:08:57.086622 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7205fdacb925cdaeaf9c91c760626a242ae1d8b4ba56a57a7568ea0520cfff3d"} err="failed to get container status 
\"7205fdacb925cdaeaf9c91c760626a242ae1d8b4ba56a57a7568ea0520cfff3d\": rpc error: code = NotFound desc = could not find container \"7205fdacb925cdaeaf9c91c760626a242ae1d8b4ba56a57a7568ea0520cfff3d\": container with ID starting with 7205fdacb925cdaeaf9c91c760626a242ae1d8b4ba56a57a7568ea0520cfff3d not found: ID does not exist" Oct 05 21:08:57 crc kubenswrapper[4753]: I1005 21:08:57.086729 4753 scope.go:117] "RemoveContainer" containerID="c3b9cf8728e74e4385303e229c429af475e6887e5a127aa3183e9369cfa5e644" Oct 05 21:08:57 crc kubenswrapper[4753]: E1005 21:08:57.086969 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3b9cf8728e74e4385303e229c429af475e6887e5a127aa3183e9369cfa5e644\": container with ID starting with c3b9cf8728e74e4385303e229c429af475e6887e5a127aa3183e9369cfa5e644 not found: ID does not exist" containerID="c3b9cf8728e74e4385303e229c429af475e6887e5a127aa3183e9369cfa5e644" Oct 05 21:08:57 crc kubenswrapper[4753]: I1005 21:08:57.086999 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3b9cf8728e74e4385303e229c429af475e6887e5a127aa3183e9369cfa5e644"} err="failed to get container status \"c3b9cf8728e74e4385303e229c429af475e6887e5a127aa3183e9369cfa5e644\": rpc error: code = NotFound desc = could not find container \"c3b9cf8728e74e4385303e229c429af475e6887e5a127aa3183e9369cfa5e644\": container with ID starting with c3b9cf8728e74e4385303e229c429af475e6887e5a127aa3183e9369cfa5e644 not found: ID does not exist" Oct 05 21:08:57 crc kubenswrapper[4753]: I1005 21:08:57.860951 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e13df3-1c57-46bb-98be-9628284ae7e6" path="/var/lib/kubelet/pods/a5e13df3-1c57-46bb-98be-9628284ae7e6/volumes" Oct 05 21:09:01 crc kubenswrapper[4753]: I1005 21:09:01.864287 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 
21:09:01 crc kubenswrapper[4753]: E1005 21:09:01.865053 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:09:12 crc kubenswrapper[4753]: I1005 21:09:12.851898 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:09:14 crc kubenswrapper[4753]: I1005 21:09:14.131995 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"8d50785440eb66ddc884cdaacfb40867533221c53067d0d7ab2326a58245174c"} Oct 05 21:09:24 crc kubenswrapper[4753]: I1005 21:09:24.225384 4753 generic.go:334] "Generic (PLEG): container finished" podID="db911cf0-3e57-45a3-a1ce-06f5260745b4" containerID="f0c5e0c9f9d3e32e53868f123671bac711deb814384788d4c81bd7ed4412bf04" exitCode=0 Oct 05 21:09:24 crc kubenswrapper[4753]: I1005 21:09:24.225484 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" event={"ID":"db911cf0-3e57-45a3-a1ce-06f5260745b4","Type":"ContainerDied","Data":"f0c5e0c9f9d3e32e53868f123671bac711deb814384788d4c81bd7ed4412bf04"} Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.677072 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.822196 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-migration-ssh-key-1\") pod \"db911cf0-3e57-45a3-a1ce-06f5260745b4\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.822255 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-cell1-compute-config-0\") pod \"db911cf0-3e57-45a3-a1ce-06f5260745b4\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.822334 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-cell1-compute-config-1\") pod \"db911cf0-3e57-45a3-a1ce-06f5260745b4\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.822355 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-inventory\") pod \"db911cf0-3e57-45a3-a1ce-06f5260745b4\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.822378 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-ceph\") pod \"db911cf0-3e57-45a3-a1ce-06f5260745b4\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.822466 4753 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/db911cf0-3e57-45a3-a1ce-06f5260745b4-ceph-nova-0\") pod \"db911cf0-3e57-45a3-a1ce-06f5260745b4\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.822525 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-ssh-key\") pod \"db911cf0-3e57-45a3-a1ce-06f5260745b4\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.822549 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm62b\" (UniqueName: \"kubernetes.io/projected/db911cf0-3e57-45a3-a1ce-06f5260745b4-kube-api-access-dm62b\") pod \"db911cf0-3e57-45a3-a1ce-06f5260745b4\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.822579 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-custom-ceph-combined-ca-bundle\") pod \"db911cf0-3e57-45a3-a1ce-06f5260745b4\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.822624 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-migration-ssh-key-0\") pod \"db911cf0-3e57-45a3-a1ce-06f5260745b4\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.822649 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-extra-config-0\") pod 
\"db911cf0-3e57-45a3-a1ce-06f5260745b4\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.837708 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-ceph" (OuterVolumeSpecName: "ceph") pod "db911cf0-3e57-45a3-a1ce-06f5260745b4" (UID: "db911cf0-3e57-45a3-a1ce-06f5260745b4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.840930 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db911cf0-3e57-45a3-a1ce-06f5260745b4-kube-api-access-dm62b" (OuterVolumeSpecName: "kube-api-access-dm62b") pod "db911cf0-3e57-45a3-a1ce-06f5260745b4" (UID: "db911cf0-3e57-45a3-a1ce-06f5260745b4"). InnerVolumeSpecName "kube-api-access-dm62b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.858555 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "db911cf0-3e57-45a3-a1ce-06f5260745b4" (UID: "db911cf0-3e57-45a3-a1ce-06f5260745b4"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.859437 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "db911cf0-3e57-45a3-a1ce-06f5260745b4" (UID: "db911cf0-3e57-45a3-a1ce-06f5260745b4"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.861288 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "db911cf0-3e57-45a3-a1ce-06f5260745b4" (UID: "db911cf0-3e57-45a3-a1ce-06f5260745b4"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.872021 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "db911cf0-3e57-45a3-a1ce-06f5260745b4" (UID: "db911cf0-3e57-45a3-a1ce-06f5260745b4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.877359 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "db911cf0-3e57-45a3-a1ce-06f5260745b4" (UID: "db911cf0-3e57-45a3-a1ce-06f5260745b4"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.879852 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "db911cf0-3e57-45a3-a1ce-06f5260745b4" (UID: "db911cf0-3e57-45a3-a1ce-06f5260745b4"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.882608 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db911cf0-3e57-45a3-a1ce-06f5260745b4-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "db911cf0-3e57-45a3-a1ce-06f5260745b4" (UID: "db911cf0-3e57-45a3-a1ce-06f5260745b4"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:09:25 crc kubenswrapper[4753]: E1005 21:09:25.887027 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-extra-config-0 podName:db911cf0-3e57-45a3-a1ce-06f5260745b4 nodeName:}" failed. No retries permitted until 2025-10-05 21:09:26.387007108 +0000 UTC m=+3275.235335340 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "nova-extra-config-0" (UniqueName: "kubernetes.io/configmap/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-extra-config-0") pod "db911cf0-3e57-45a3-a1ce-06f5260745b4" (UID: "db911cf0-3e57-45a3-a1ce-06f5260745b4") : error deleting /var/lib/kubelet/pods/db911cf0-3e57-45a3-a1ce-06f5260745b4/volume-subpaths: remove /var/lib/kubelet/pods/db911cf0-3e57-45a3-a1ce-06f5260745b4/volume-subpaths: no such file or directory Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.887599 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-inventory" (OuterVolumeSpecName: "inventory") pod "db911cf0-3e57-45a3-a1ce-06f5260745b4" (UID: "db911cf0-3e57-45a3-a1ce-06f5260745b4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.924126 4753 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.924167 4753 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.924176 4753 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.924184 4753 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-inventory\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.924192 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.924202 4753 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/db911cf0-3e57-45a3-a1ce-06f5260745b4-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.924210 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.924218 
4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm62b\" (UniqueName: \"kubernetes.io/projected/db911cf0-3e57-45a3-a1ce-06f5260745b4-kube-api-access-dm62b\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.924227 4753 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:25 crc kubenswrapper[4753]: I1005 21:09:25.924236 4753 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:26 crc kubenswrapper[4753]: I1005 21:09:26.249488 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" event={"ID":"db911cf0-3e57-45a3-a1ce-06f5260745b4","Type":"ContainerDied","Data":"05a1e3a31b81d34110f35d9bbf335347770be88b2a3681c6eca860acb4174ad9"} Oct 05 21:09:26 crc kubenswrapper[4753]: I1005 21:09:26.249824 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05a1e3a31b81d34110f35d9bbf335347770be88b2a3681c6eca860acb4174ad9" Oct 05 21:09:26 crc kubenswrapper[4753]: I1005 21:09:26.249541 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk" Oct 05 21:09:26 crc kubenswrapper[4753]: I1005 21:09:26.433917 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-extra-config-0\") pod \"db911cf0-3e57-45a3-a1ce-06f5260745b4\" (UID: \"db911cf0-3e57-45a3-a1ce-06f5260745b4\") " Oct 05 21:09:26 crc kubenswrapper[4753]: I1005 21:09:26.434714 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "db911cf0-3e57-45a3-a1ce-06f5260745b4" (UID: "db911cf0-3e57-45a3-a1ce-06f5260745b4"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:09:26 crc kubenswrapper[4753]: I1005 21:09:26.536494 4753 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/db911cf0-3e57-45a3-a1ce-06f5260745b4-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.341646 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 05 21:09:41 crc kubenswrapper[4753]: E1005 21:09:41.362018 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e13df3-1c57-46bb-98be-9628284ae7e6" containerName="registry-server" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.362055 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e13df3-1c57-46bb-98be-9628284ae7e6" containerName="registry-server" Oct 05 21:09:41 crc kubenswrapper[4753]: E1005 21:09:41.362087 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e13df3-1c57-46bb-98be-9628284ae7e6" containerName="extract-utilities" Oct 05 21:09:41 crc kubenswrapper[4753]: 
I1005 21:09:41.362098 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e13df3-1c57-46bb-98be-9628284ae7e6" containerName="extract-utilities" Oct 05 21:09:41 crc kubenswrapper[4753]: E1005 21:09:41.362166 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e13df3-1c57-46bb-98be-9628284ae7e6" containerName="extract-content" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.362175 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e13df3-1c57-46bb-98be-9628284ae7e6" containerName="extract-content" Oct 05 21:09:41 crc kubenswrapper[4753]: E1005 21:09:41.362190 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db911cf0-3e57-45a3-a1ce-06f5260745b4" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.362196 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="db911cf0-3e57-45a3-a1ce-06f5260745b4" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.362731 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="db911cf0-3e57-45a3-a1ce-06f5260745b4" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.362761 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e13df3-1c57-46bb-98be-9628284ae7e6" containerName="registry-server" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.364288 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.374580 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.375079 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.386240 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.400603 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.403163 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.410360 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.424165 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.452721 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45a5357e-d55a-4532-aaff-fe090b71fc60-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.452771 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adbbbc89-97ba-492f-a842-c9bf33a69480-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc 
kubenswrapper[4753]: I1005 21:09:41.452796 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-dev\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.452813 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-dev\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.452837 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.452850 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-run\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.452866 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.452884 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.452900 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4hvj\" (UniqueName: \"kubernetes.io/projected/45a5357e-d55a-4532-aaff-fe090b71fc60-kube-api-access-n4hvj\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.452916 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/45a5357e-d55a-4532-aaff-fe090b71fc60-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.452939 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-run\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.452956 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45a5357e-d55a-4532-aaff-fe090b71fc60-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.452975 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/adbbbc89-97ba-492f-a842-c9bf33a69480-ceph\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.452989 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453006 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453035 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml7ph\" (UniqueName: \"kubernetes.io/projected/adbbbc89-97ba-492f-a842-c9bf33a69480-kube-api-access-ml7ph\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453090 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453105 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453125 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453154 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-lib-modules\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453338 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453392 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45a5357e-d55a-4532-aaff-fe090b71fc60-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453424 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-var-lib-cinder\") pod 
\"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453452 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-etc-nvme\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453481 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adbbbc89-97ba-492f-a842-c9bf33a69480-config-data\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453513 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453571 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45a5357e-d55a-4532-aaff-fe090b71fc60-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453606 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adbbbc89-97ba-492f-a842-c9bf33a69480-scripts\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " 
pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453623 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-sys\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453649 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453707 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-sys\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.453720 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adbbbc89-97ba-492f-a842-c9bf33a69480-config-data-custom\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.554725 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-run\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.554766 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45a5357e-d55a-4532-aaff-fe090b71fc60-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.554788 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/adbbbc89-97ba-492f-a842-c9bf33a69480-ceph\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.554802 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.554817 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.554843 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml7ph\" (UniqueName: \"kubernetes.io/projected/adbbbc89-97ba-492f-a842-c9bf33a69480-kube-api-access-ml7ph\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.554860 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-var-locks-brick\") pod 
\"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.554875 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.554889 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.554883 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-run\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.554936 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-lib-modules\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.554902 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-lib-modules\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.554978 4753 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.554996 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45a5357e-d55a-4532-aaff-fe090b71fc60-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555012 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555027 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-etc-nvme\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555029 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555045 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adbbbc89-97ba-492f-a842-c9bf33a69480-config-data\") pod \"cinder-backup-0\" (UID: 
\"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555066 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555103 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45a5357e-d55a-4532-aaff-fe090b71fc60-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555128 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adbbbc89-97ba-492f-a842-c9bf33a69480-scripts\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555159 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-sys\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555180 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555204 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-sys\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555217 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adbbbc89-97ba-492f-a842-c9bf33a69480-config-data-custom\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555240 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45a5357e-d55a-4532-aaff-fe090b71fc60-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555258 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adbbbc89-97ba-492f-a842-c9bf33a69480-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555272 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-dev\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555286 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-dev\") pod \"cinder-volume-volume1-0\" (UID: 
\"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555307 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555321 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-run\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555336 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555355 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555371 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4hvj\" (UniqueName: \"kubernetes.io/projected/45a5357e-d55a-4532-aaff-fe090b71fc60-kube-api-access-n4hvj\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc 
kubenswrapper[4753]: I1005 21:09:41.555387 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/45a5357e-d55a-4532-aaff-fe090b71fc60-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.555854 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-sys\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.556134 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-dev\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.556842 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.556845 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.556980 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-var-lib-cinder\") 
pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.556900 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.558198 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-sys\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.558234 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-run\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.558250 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-etc-nvme\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.558279 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.556878 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.558546 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/adbbbc89-97ba-492f-a842-c9bf33a69480-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.558658 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.558743 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.559078 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-dev\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.560332 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/45a5357e-d55a-4532-aaff-fe090b71fc60-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " 
pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.560391 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.560867 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45a5357e-d55a-4532-aaff-fe090b71fc60-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.560922 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/45a5357e-d55a-4532-aaff-fe090b71fc60-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.561040 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adbbbc89-97ba-492f-a842-c9bf33a69480-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.563940 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45a5357e-d55a-4532-aaff-fe090b71fc60-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.564578 4753 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45a5357e-d55a-4532-aaff-fe090b71fc60-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.565723 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/adbbbc89-97ba-492f-a842-c9bf33a69480-ceph\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.566806 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adbbbc89-97ba-492f-a842-c9bf33a69480-config-data\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.572713 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adbbbc89-97ba-492f-a842-c9bf33a69480-config-data-custom\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.575556 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45a5357e-d55a-4532-aaff-fe090b71fc60-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.575733 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adbbbc89-97ba-492f-a842-c9bf33a69480-scripts\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 
crc kubenswrapper[4753]: I1005 21:09:41.577355 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml7ph\" (UniqueName: \"kubernetes.io/projected/adbbbc89-97ba-492f-a842-c9bf33a69480-kube-api-access-ml7ph\") pod \"cinder-backup-0\" (UID: \"adbbbc89-97ba-492f-a842-c9bf33a69480\") " pod="openstack/cinder-backup-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.599005 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4hvj\" (UniqueName: \"kubernetes.io/projected/45a5357e-d55a-4532-aaff-fe090b71fc60-kube-api-access-n4hvj\") pod \"cinder-volume-volume1-0\" (UID: \"45a5357e-d55a-4532-aaff-fe090b71fc60\") " pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.689465 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:41 crc kubenswrapper[4753]: I1005 21:09:41.731270 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.127373 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-g89nn"] Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.128707 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-g89nn" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.144293 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-g89nn"] Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.258191 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.260678 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.265177 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9sx8b" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.265347 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.265502 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.265615 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.274247 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45fk7\" (UniqueName: \"kubernetes.io/projected/9de08ec5-cac4-4f6b-8b34-e63f0d613e00-kube-api-access-45fk7\") pod \"manila-db-create-g89nn\" (UID: \"9de08ec5-cac4-4f6b-8b34-e63f0d613e00\") " pod="openstack/manila-db-create-g89nn" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.279462 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.344830 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.346588 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.351851 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.352027 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.359003 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.375869 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-scripts\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.375953 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7d4226f-3ebe-4569-9124-7394a4ae482d-logs\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.376009 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45fk7\" (UniqueName: \"kubernetes.io/projected/9de08ec5-cac4-4f6b-8b34-e63f0d613e00-kube-api-access-45fk7\") pod \"manila-db-create-g89nn\" (UID: \"9de08ec5-cac4-4f6b-8b34-e63f0d613e00\") " pod="openstack/manila-db-create-g89nn" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.376039 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/a7d4226f-3ebe-4569-9124-7394a4ae482d-ceph\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.376073 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.376099 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.381227 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7d4226f-3ebe-4569-9124-7394a4ae482d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.381291 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm4xp\" (UniqueName: \"kubernetes.io/projected/a7d4226f-3ebe-4569-9124-7394a4ae482d-kube-api-access-mm4xp\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.381336 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.381378 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-config-data\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.392619 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-85489f4f6c-b854l"] Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.393985 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.399498 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.399739 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.400303 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-8xls5" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.409009 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.413302 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85489f4f6c-b854l"] Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.458481 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45fk7\" (UniqueName: 
\"kubernetes.io/projected/9de08ec5-cac4-4f6b-8b34-e63f0d613e00-kube-api-access-45fk7\") pod \"manila-db-create-g89nn\" (UID: \"9de08ec5-cac4-4f6b-8b34-e63f0d613e00\") " pod="openstack/manila-db-create-g89nn" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.485990 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a7d4226f-3ebe-4569-9124-7394a4ae482d-ceph\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486040 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb57t\" (UniqueName: \"kubernetes.io/projected/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-kube-api-access-wb57t\") pod \"horizon-85489f4f6c-b854l\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486068 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbd68b11-473b-45fe-aaa4-1cbd471c8203-logs\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486094 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-config-data\") pod \"horizon-85489f4f6c-b854l\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486113 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-logs\") pod \"horizon-85489f4f6c-b854l\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486167 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486183 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486208 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486310 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486368 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvwrp\" (UniqueName: 
\"kubernetes.io/projected/cbd68b11-473b-45fe-aaa4-1cbd471c8203-kube-api-access-tvwrp\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486387 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486429 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cbd68b11-473b-45fe-aaa4-1cbd471c8203-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486466 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7d4226f-3ebe-4569-9124-7394a4ae482d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486518 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbd68b11-473b-45fe-aaa4-1cbd471c8203-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486540 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4xp\" (UniqueName: 
\"kubernetes.io/projected/a7d4226f-3ebe-4569-9124-7394a4ae482d-kube-api-access-mm4xp\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486561 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486616 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486633 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-scripts\") pod \"horizon-85489f4f6c-b854l\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486684 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-config-data\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486700 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486778 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-scripts\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486795 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7d4226f-3ebe-4569-9124-7394a4ae482d-logs\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.486900 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-horizon-secret-key\") pod \"horizon-85489f4f6c-b854l\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.490095 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.495930 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7d4226f-3ebe-4569-9124-7394a4ae482d-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.500314 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7d4226f-3ebe-4569-9124-7394a4ae482d-logs\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.511023 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 05 21:09:42 crc kubenswrapper[4753]: E1005 21:09:42.511692 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-tvwrp logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="cbd68b11-473b-45fe-aaa4-1cbd471c8203" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.518069 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-scripts\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.531623 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-config-data\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.532093 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.532196 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.538598 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a7d4226f-3ebe-4569-9124-7394a4ae482d-ceph\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.562451 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4xp\" (UniqueName: \"kubernetes.io/projected/a7d4226f-3ebe-4569-9124-7394a4ae482d-kube-api-access-mm4xp\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.569346 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c854c57b9-xprgc"] Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.571316 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.583311 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c854c57b9-xprgc"] Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.596845 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-horizon-secret-key\") pod \"horizon-85489f4f6c-b854l\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.596916 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-horizon-secret-key\") pod \"horizon-7c854c57b9-xprgc\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.596943 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb57t\" (UniqueName: \"kubernetes.io/projected/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-kube-api-access-wb57t\") pod \"horizon-85489f4f6c-b854l\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.596960 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbd68b11-473b-45fe-aaa4-1cbd471c8203-logs\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.596986 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-config-data\") pod \"horizon-85489f4f6c-b854l\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.597003 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-logs\") pod \"horizon-85489f4f6c-b854l\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.597039 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.597091 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.597110 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvwrp\" (UniqueName: \"kubernetes.io/projected/cbd68b11-473b-45fe-aaa4-1cbd471c8203-kube-api-access-tvwrp\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.597126 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.597156 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cbd68b11-473b-45fe-aaa4-1cbd471c8203-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.597181 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd4vn\" (UniqueName: \"kubernetes.io/projected/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-kube-api-access-nd4vn\") pod \"horizon-7c854c57b9-xprgc\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.597206 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-config-data\") pod \"horizon-7c854c57b9-xprgc\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.597225 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbd68b11-473b-45fe-aaa4-1cbd471c8203-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.597250 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.597276 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-scripts\") pod \"horizon-7c854c57b9-xprgc\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.597293 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-scripts\") pod \"horizon-85489f4f6c-b854l\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.597318 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-logs\") pod \"horizon-7c854c57b9-xprgc\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.597336 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.597517 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 
05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.605713 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbd68b11-473b-45fe-aaa4-1cbd471c8203-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.618487 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbd68b11-473b-45fe-aaa4-1cbd471c8203-logs\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.619859 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-scripts\") pod \"horizon-85489f4f6c-b854l\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.620373 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-config-data\") pod \"horizon-85489f4f6c-b854l\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.620388 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-logs\") pod \"horizon-85489f4f6c-b854l\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.629842 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-horizon-secret-key\") pod \"horizon-85489f4f6c-b854l\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.631073 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 05 21:09:42 crc kubenswrapper[4753]: E1005 21:09:42.631772 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="a7d4226f-3ebe-4569-9124-7394a4ae482d" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.635734 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cbd68b11-473b-45fe-aaa4-1cbd471c8203-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.637167 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.644466 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.659242 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvwrp\" (UniqueName: \"kubernetes.io/projected/cbd68b11-473b-45fe-aaa4-1cbd471c8203-kube-api-access-tvwrp\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 
21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.661212 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.661645 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.663312 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.681555 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.683740 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb57t\" (UniqueName: \"kubernetes.io/projected/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-kube-api-access-wb57t\") pod \"horizon-85489f4f6c-b854l\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.722499 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-backup-0"] Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.726882 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-horizon-secret-key\") pod \"horizon-7c854c57b9-xprgc\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.729448 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd4vn\" (UniqueName: \"kubernetes.io/projected/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-kube-api-access-nd4vn\") pod \"horizon-7c854c57b9-xprgc\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.729502 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-config-data\") pod \"horizon-7c854c57b9-xprgc\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.729564 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-scripts\") pod \"horizon-7c854c57b9-xprgc\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.729603 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-logs\") pod \"horizon-7c854c57b9-xprgc\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.730020 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-logs\") pod \"horizon-7c854c57b9-xprgc\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.735241 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-scripts\") pod \"horizon-7c854c57b9-xprgc\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.735862 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-config-data\") pod \"horizon-7c854c57b9-xprgc\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.749224 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-g89nn" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.755115 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.757551 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-horizon-secret-key\") pod \"horizon-7c854c57b9-xprgc\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.758439 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd4vn\" (UniqueName: \"kubernetes.io/projected/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-kube-api-access-nd4vn\") pod \"horizon-7c854c57b9-xprgc\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.771160 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:42 crc kubenswrapper[4753]: I1005 21:09:42.801787 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.252734 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-g89nn"] Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.330218 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85489f4f6c-b854l"] Oct 05 21:09:43 crc kubenswrapper[4753]: W1005 21:09:43.355075 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d78ff70_0c68_4694_8b5b_c07b2e0c207f.slice/crio-6c6cc5aae59a503a47d225f94dd5c64d5c7aca8176a26ddf07281f349b1dc12b WatchSource:0}: Error finding container 6c6cc5aae59a503a47d225f94dd5c64d5c7aca8176a26ddf07281f349b1dc12b: Status 404 returned error can't find the container with id 6c6cc5aae59a503a47d225f94dd5c64d5c7aca8176a26ddf07281f349b1dc12b Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.415029 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c854c57b9-xprgc"] Oct 05 21:09:43 crc kubenswrapper[4753]: W1005 21:09:43.417806 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c8dd0cb_cdf9_49ae_91eb_4dd161894ee0.slice/crio-8e3e9a51b2ddc2a6f56137e08114e2228fb63862167cc7b4ead8152204d63ab2 WatchSource:0}: Error finding container 8e3e9a51b2ddc2a6f56137e08114e2228fb63862167cc7b4ead8152204d63ab2: Status 404 returned error can't find the container with id 8e3e9a51b2ddc2a6f56137e08114e2228fb63862167cc7b4ead8152204d63ab2 Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.430072 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"adbbbc89-97ba-492f-a842-c9bf33a69480","Type":"ContainerStarted","Data":"2c3f8be08dba956834f71427fcd902528f7f37a099924606fda0768067e69e19"} Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.431567 4753 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"45a5357e-d55a-4532-aaff-fe090b71fc60","Type":"ContainerStarted","Data":"d1cd27b5f738600ecfc6492966e4fd0a297854da2e31817e0b5cfb013879697b"} Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.433362 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85489f4f6c-b854l" event={"ID":"8d78ff70-0c68-4694-8b5b-c07b2e0c207f","Type":"ContainerStarted","Data":"6c6cc5aae59a503a47d225f94dd5c64d5c7aca8176a26ddf07281f349b1dc12b"} Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.436781 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c854c57b9-xprgc" event={"ID":"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0","Type":"ContainerStarted","Data":"8e3e9a51b2ddc2a6f56137e08114e2228fb63862167cc7b4ead8152204d63ab2"} Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.441077 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-g89nn" event={"ID":"9de08ec5-cac4-4f6b-8b34-e63f0d613e00","Type":"ContainerStarted","Data":"12f6da285bc82522a8904649780d499933eeeab26b4f9dd5ae473a980c69b17c"} Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.441128 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.441157 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.453500 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.461224 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.644733 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvwrp\" (UniqueName: \"kubernetes.io/projected/cbd68b11-473b-45fe-aaa4-1cbd471c8203-kube-api-access-tvwrp\") pod \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.644790 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-config-data\") pod \"a7d4226f-3ebe-4569-9124-7394a4ae482d\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.644810 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-config-data\") pod \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.644861 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-public-tls-certs\") pod \"a7d4226f-3ebe-4569-9124-7394a4ae482d\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.644890 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-scripts\") pod \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.644913 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"a7d4226f-3ebe-4569-9124-7394a4ae482d\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.644930 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cbd68b11-473b-45fe-aaa4-1cbd471c8203-ceph\") pod \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.644943 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-combined-ca-bundle\") pod \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.644974 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7d4226f-3ebe-4569-9124-7394a4ae482d-httpd-run\") pod \"a7d4226f-3ebe-4569-9124-7394a4ae482d\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.644999 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a7d4226f-3ebe-4569-9124-7394a4ae482d-ceph\") pod \"a7d4226f-3ebe-4569-9124-7394a4ae482d\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.645017 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-internal-tls-certs\") pod \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.645036 4753 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7d4226f-3ebe-4569-9124-7394a4ae482d-logs\") pod \"a7d4226f-3ebe-4569-9124-7394a4ae482d\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.645061 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-scripts\") pod \"a7d4226f-3ebe-4569-9124-7394a4ae482d\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.645089 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-combined-ca-bundle\") pod \"a7d4226f-3ebe-4569-9124-7394a4ae482d\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.645134 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm4xp\" (UniqueName: \"kubernetes.io/projected/a7d4226f-3ebe-4569-9124-7394a4ae482d-kube-api-access-mm4xp\") pod \"a7d4226f-3ebe-4569-9124-7394a4ae482d\" (UID: \"a7d4226f-3ebe-4569-9124-7394a4ae482d\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.645220 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbd68b11-473b-45fe-aaa4-1cbd471c8203-httpd-run\") pod \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.645249 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " Oct 05 21:09:43 crc 
kubenswrapper[4753]: I1005 21:09:43.645285 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbd68b11-473b-45fe-aaa4-1cbd471c8203-logs\") pod \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\" (UID: \"cbd68b11-473b-45fe-aaa4-1cbd471c8203\") " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.645483 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7d4226f-3ebe-4569-9124-7394a4ae482d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a7d4226f-3ebe-4569-9124-7394a4ae482d" (UID: "a7d4226f-3ebe-4569-9124-7394a4ae482d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.645736 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbd68b11-473b-45fe-aaa4-1cbd471c8203-logs" (OuterVolumeSpecName: "logs") pod "cbd68b11-473b-45fe-aaa4-1cbd471c8203" (UID: "cbd68b11-473b-45fe-aaa4-1cbd471c8203"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.646171 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbd68b11-473b-45fe-aaa4-1cbd471c8203-logs\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.646192 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7d4226f-3ebe-4569-9124-7394a4ae482d-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.647121 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7d4226f-3ebe-4569-9124-7394a4ae482d-logs" (OuterVolumeSpecName: "logs") pod "a7d4226f-3ebe-4569-9124-7394a4ae482d" (UID: "a7d4226f-3ebe-4569-9124-7394a4ae482d"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.647363 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbd68b11-473b-45fe-aaa4-1cbd471c8203-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cbd68b11-473b-45fe-aaa4-1cbd471c8203" (UID: "cbd68b11-473b-45fe-aaa4-1cbd471c8203"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.655963 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-scripts" (OuterVolumeSpecName: "scripts") pod "a7d4226f-3ebe-4569-9124-7394a4ae482d" (UID: "a7d4226f-3ebe-4569-9124-7394a4ae482d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.657836 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7d4226f-3ebe-4569-9124-7394a4ae482d" (UID: "a7d4226f-3ebe-4569-9124-7394a4ae482d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.658395 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-config-data" (OuterVolumeSpecName: "config-data") pod "a7d4226f-3ebe-4569-9124-7394a4ae482d" (UID: "a7d4226f-3ebe-4569-9124-7394a4ae482d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.658458 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "cbd68b11-473b-45fe-aaa4-1cbd471c8203" (UID: "cbd68b11-473b-45fe-aaa4-1cbd471c8203"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.658451 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd68b11-473b-45fe-aaa4-1cbd471c8203-kube-api-access-tvwrp" (OuterVolumeSpecName: "kube-api-access-tvwrp") pod "cbd68b11-473b-45fe-aaa4-1cbd471c8203" (UID: "cbd68b11-473b-45fe-aaa4-1cbd471c8203"). InnerVolumeSpecName "kube-api-access-tvwrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.658504 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a7d4226f-3ebe-4569-9124-7394a4ae482d" (UID: "a7d4226f-3ebe-4569-9124-7394a4ae482d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.658712 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "a7d4226f-3ebe-4569-9124-7394a4ae482d" (UID: "a7d4226f-3ebe-4569-9124-7394a4ae482d"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.661456 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-scripts" (OuterVolumeSpecName: "scripts") pod "cbd68b11-473b-45fe-aaa4-1cbd471c8203" (UID: "cbd68b11-473b-45fe-aaa4-1cbd471c8203"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.666124 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cbd68b11-473b-45fe-aaa4-1cbd471c8203" (UID: "cbd68b11-473b-45fe-aaa4-1cbd471c8203"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.666364 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd68b11-473b-45fe-aaa4-1cbd471c8203-ceph" (OuterVolumeSpecName: "ceph") pod "cbd68b11-473b-45fe-aaa4-1cbd471c8203" (UID: "cbd68b11-473b-45fe-aaa4-1cbd471c8203"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.666385 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d4226f-3ebe-4569-9124-7394a4ae482d-kube-api-access-mm4xp" (OuterVolumeSpecName: "kube-api-access-mm4xp") pod "a7d4226f-3ebe-4569-9124-7394a4ae482d" (UID: "a7d4226f-3ebe-4569-9124-7394a4ae482d"). InnerVolumeSpecName "kube-api-access-mm4xp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.667233 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-config-data" (OuterVolumeSpecName: "config-data") pod "cbd68b11-473b-45fe-aaa4-1cbd471c8203" (UID: "cbd68b11-473b-45fe-aaa4-1cbd471c8203"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.668245 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbd68b11-473b-45fe-aaa4-1cbd471c8203" (UID: "cbd68b11-473b-45fe-aaa4-1cbd471c8203"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.676453 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d4226f-3ebe-4569-9124-7394a4ae482d-ceph" (OuterVolumeSpecName: "ceph") pod "a7d4226f-3ebe-4569-9124-7394a4ae482d" (UID: "a7d4226f-3ebe-4569-9124-7394a4ae482d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.747416 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvwrp\" (UniqueName: \"kubernetes.io/projected/cbd68b11-473b-45fe-aaa4-1cbd471c8203-kube-api-access-tvwrp\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.747453 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.747464 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.747474 4753 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.747484 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.747523 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.747534 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cbd68b11-473b-45fe-aaa4-1cbd471c8203-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.747543 4753 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.747552 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a7d4226f-3ebe-4569-9124-7394a4ae482d-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.747559 4753 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd68b11-473b-45fe-aaa4-1cbd471c8203-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.747568 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7d4226f-3ebe-4569-9124-7394a4ae482d-logs\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.747576 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.747584 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d4226f-3ebe-4569-9124-7394a4ae482d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.747592 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm4xp\" (UniqueName: \"kubernetes.io/projected/a7d4226f-3ebe-4569-9124-7394a4ae482d-kube-api-access-mm4xp\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.747600 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbd68b11-473b-45fe-aaa4-1cbd471c8203-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 
crc kubenswrapper[4753]: I1005 21:09:43.747623 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.768308 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.771319 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.850349 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:43 crc kubenswrapper[4753]: I1005 21:09:43.850375 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.457271 4753 generic.go:334] "Generic (PLEG): container finished" podID="9de08ec5-cac4-4f6b-8b34-e63f0d613e00" containerID="4a127b0b22bf63d7a7f83d1320a634418650643cde7be0da90306e706633e7d1" exitCode=0 Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.457355 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.457869 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-g89nn" event={"ID":"9de08ec5-cac4-4f6b-8b34-e63f0d613e00","Type":"ContainerDied","Data":"4a127b0b22bf63d7a7f83d1320a634418650643cde7be0da90306e706633e7d1"} Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.457953 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.493175 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c854c57b9-xprgc"] Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.642369 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-55b7f7b494-jrzrq"] Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.644572 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.662061 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.694117 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55b7f7b494-jrzrq"] Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.727787 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.748270 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.766368 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-config-data\") pod 
\"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.767133 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-combined-ca-bundle\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.767314 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7qjv\" (UniqueName: \"kubernetes.io/projected/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-kube-api-access-n7qjv\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.767394 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-horizon-secret-key\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.767494 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-logs\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.767598 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-scripts\") pod 
\"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.767680 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-horizon-tls-certs\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.786376 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85489f4f6c-b854l"] Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.806009 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.807486 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.827003 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.827195 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.827315 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9sx8b" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.827423 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.835201 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.868731 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-horizon-tls-certs\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.868794 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-config-data\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.868836 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-combined-ca-bundle\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.868903 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7qjv\" (UniqueName: \"kubernetes.io/projected/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-kube-api-access-n7qjv\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.868926 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-horizon-secret-key\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.868952 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-logs\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.868984 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-scripts\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.869828 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-scripts\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.871730 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-config-data\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.874863 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-logs\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.874912 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.882854 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-horizon-secret-key\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.903831 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-combined-ca-bundle\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.909515 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.911819 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-horizon-tls-certs\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.958290 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-745b9fcf5d-xkxjq"] Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.960447 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.971867 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4606d5be-d97d-4c1b-95df-1aad021ced17-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.971903 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z78h6\" (UniqueName: \"kubernetes.io/projected/4606d5be-d97d-4c1b-95df-1aad021ced17-kube-api-access-z78h6\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.971966 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4606d5be-d97d-4c1b-95df-1aad021ced17-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.972002 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.972037 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4606d5be-d97d-4c1b-95df-1aad021ced17-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.972078 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4606d5be-d97d-4c1b-95df-1aad021ced17-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.972107 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4606d5be-d97d-4c1b-95df-1aad021ced17-logs\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.972159 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4606d5be-d97d-4c1b-95df-1aad021ced17-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.972181 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4606d5be-d97d-4c1b-95df-1aad021ced17-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:44 crc kubenswrapper[4753]: I1005 21:09:44.972316 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:44.998042 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.008779 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.018936 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.092743 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-745b9fcf5d-xkxjq"] Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.121877 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.121953 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75e67d63-2798-48de-af86-f8334eecdc26-logs\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.121979 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1309d62-7702-49bc-892f-705d8ac9fff3-config-data\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.122013 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e1309d62-7702-49bc-892f-705d8ac9fff3-horizon-tls-certs\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.122034 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-scripts\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.122063 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4606d5be-d97d-4c1b-95df-1aad021ced17-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.122086 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z78h6\" (UniqueName: \"kubernetes.io/projected/4606d5be-d97d-4c1b-95df-1aad021ced17-kube-api-access-z78h6\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.122120 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndrhv\" (UniqueName: \"kubernetes.io/projected/75e67d63-2798-48de-af86-f8334eecdc26-kube-api-access-ndrhv\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.130081 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/e1309d62-7702-49bc-892f-705d8ac9fff3-horizon-secret-key\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.130760 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7qjv\" (UniqueName: \"kubernetes.io/projected/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-kube-api-access-n7qjv\") pod \"horizon-55b7f7b494-jrzrq\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.130911 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/75e67d63-2798-48de-af86-f8334eecdc26-ceph\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.130970 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4606d5be-d97d-4c1b-95df-1aad021ced17-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.131008 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.131034 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.131082 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1309d62-7702-49bc-892f-705d8ac9fff3-scripts\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.131101 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4606d5be-d97d-4c1b-95df-1aad021ced17-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.131132 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1309d62-7702-49bc-892f-705d8ac9fff3-combined-ca-bundle\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.131190 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75e67d63-2798-48de-af86-f8334eecdc26-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.131208 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4606d5be-d97d-4c1b-95df-1aad021ced17-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.131246 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-config-data\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.131290 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4606d5be-d97d-4c1b-95df-1aad021ced17-logs\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.131324 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.131361 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwrg4\" (UniqueName: \"kubernetes.io/projected/e1309d62-7702-49bc-892f-705d8ac9fff3-kube-api-access-qwrg4\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.131411 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/4606d5be-d97d-4c1b-95df-1aad021ced17-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.131445 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1309d62-7702-49bc-892f-705d8ac9fff3-logs\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.131748 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4606d5be-d97d-4c1b-95df-1aad021ced17-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.131766 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.132271 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4606d5be-d97d-4c1b-95df-1aad021ced17-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.135796 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4606d5be-d97d-4c1b-95df-1aad021ced17-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.162288 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.213663 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.239754 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4606d5be-d97d-4c1b-95df-1aad021ced17-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.239791 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4606d5be-d97d-4c1b-95df-1aad021ced17-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.240780 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4606d5be-d97d-4c1b-95df-1aad021ced17-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.244687 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4606d5be-d97d-4c1b-95df-1aad021ced17-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc 
kubenswrapper[4753]: I1005 21:09:45.247978 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4606d5be-d97d-4c1b-95df-1aad021ced17-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.252180 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z78h6\" (UniqueName: \"kubernetes.io/projected/4606d5be-d97d-4c1b-95df-1aad021ced17-kube-api-access-z78h6\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253518 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253563 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwrg4\" (UniqueName: \"kubernetes.io/projected/e1309d62-7702-49bc-892f-705d8ac9fff3-kube-api-access-qwrg4\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253601 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1309d62-7702-49bc-892f-705d8ac9fff3-logs\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253642 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253673 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75e67d63-2798-48de-af86-f8334eecdc26-logs\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253690 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1309d62-7702-49bc-892f-705d8ac9fff3-config-data\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253709 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1309d62-7702-49bc-892f-705d8ac9fff3-horizon-tls-certs\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253726 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-scripts\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253751 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndrhv\" (UniqueName: 
\"kubernetes.io/projected/75e67d63-2798-48de-af86-f8334eecdc26-kube-api-access-ndrhv\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253771 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e1309d62-7702-49bc-892f-705d8ac9fff3-horizon-secret-key\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253799 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/75e67d63-2798-48de-af86-f8334eecdc26-ceph\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253827 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253848 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1309d62-7702-49bc-892f-705d8ac9fff3-scripts\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253867 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1309d62-7702-49bc-892f-705d8ac9fff3-combined-ca-bundle\") 
pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253887 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75e67d63-2798-48de-af86-f8334eecdc26-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.253912 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-config-data\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.256728 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1309d62-7702-49bc-892f-705d8ac9fff3-logs\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.267049 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1309d62-7702-49bc-892f-705d8ac9fff3-horizon-tls-certs\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.267184 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") device mount path \"/mnt/openstack/pv03\"" 
pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.268004 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75e67d63-2798-48de-af86-f8334eecdc26-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.268769 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1309d62-7702-49bc-892f-705d8ac9fff3-scripts\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.268843 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-config-data\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.271332 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75e67d63-2798-48de-af86-f8334eecdc26-logs\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.273856 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1309d62-7702-49bc-892f-705d8ac9fff3-config-data\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.277735 4753 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/75e67d63-2798-48de-af86-f8334eecdc26-ceph\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.282922 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.287701 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.287818 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndrhv\" (UniqueName: \"kubernetes.io/projected/75e67d63-2798-48de-af86-f8334eecdc26-kube-api-access-ndrhv\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.288029 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e1309d62-7702-49bc-892f-705d8ac9fff3-horizon-secret-key\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.292613 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-scripts\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.306878 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1309d62-7702-49bc-892f-705d8ac9fff3-combined-ca-bundle\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.314638 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.332157 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwrg4\" (UniqueName: \"kubernetes.io/projected/e1309d62-7702-49bc-892f-705d8ac9fff3-kube-api-access-qwrg4\") pod \"horizon-745b9fcf5d-xkxjq\" (UID: \"e1309d62-7702-49bc-892f-705d8ac9fff3\") " pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.341120 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4606d5be-d97d-4c1b-95df-1aad021ced17\") " pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.349746 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.409602 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.426839 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:09:45 crc kubenswrapper[4753]: I1005 21:09:45.439761 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 05 21:09:46 crc kubenswrapper[4753]: I1005 21:09:45.892232 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d4226f-3ebe-4569-9124-7394a4ae482d" path="/var/lib/kubelet/pods/a7d4226f-3ebe-4569-9124-7394a4ae482d/volumes" Oct 05 21:09:46 crc kubenswrapper[4753]: I1005 21:09:45.893812 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbd68b11-473b-45fe-aaa4-1cbd471c8203" path="/var/lib/kubelet/pods/cbd68b11-473b-45fe-aaa4-1cbd471c8203/volumes" Oct 05 21:09:46 crc kubenswrapper[4753]: I1005 21:09:45.925438 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-g89nn" Oct 05 21:09:46 crc kubenswrapper[4753]: I1005 21:09:46.071872 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45fk7\" (UniqueName: \"kubernetes.io/projected/9de08ec5-cac4-4f6b-8b34-e63f0d613e00-kube-api-access-45fk7\") pod \"9de08ec5-cac4-4f6b-8b34-e63f0d613e00\" (UID: \"9de08ec5-cac4-4f6b-8b34-e63f0d613e00\") " Oct 05 21:09:46 crc kubenswrapper[4753]: I1005 21:09:46.078589 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9de08ec5-cac4-4f6b-8b34-e63f0d613e00-kube-api-access-45fk7" (OuterVolumeSpecName: "kube-api-access-45fk7") pod "9de08ec5-cac4-4f6b-8b34-e63f0d613e00" (UID: "9de08ec5-cac4-4f6b-8b34-e63f0d613e00"). InnerVolumeSpecName "kube-api-access-45fk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:09:46 crc kubenswrapper[4753]: I1005 21:09:46.174525 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45fk7\" (UniqueName: \"kubernetes.io/projected/9de08ec5-cac4-4f6b-8b34-e63f0d613e00-kube-api-access-45fk7\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:46 crc kubenswrapper[4753]: I1005 21:09:46.516557 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"45a5357e-d55a-4532-aaff-fe090b71fc60","Type":"ContainerStarted","Data":"18ddebc05204debd9f54fff8a1b8422fe7214ff1bba0e421a2f4a32cda81a47b"} Oct 05 21:09:46 crc kubenswrapper[4753]: I1005 21:09:46.520291 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"45a5357e-d55a-4532-aaff-fe090b71fc60","Type":"ContainerStarted","Data":"beb887830fcdfb71dfe4cb25e3cdd7c4ea22fb24ef4b824c85b02ea7782346be"} Oct 05 21:09:46 crc kubenswrapper[4753]: I1005 21:09:46.520310 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-g89nn" event={"ID":"9de08ec5-cac4-4f6b-8b34-e63f0d613e00","Type":"ContainerDied","Data":"12f6da285bc82522a8904649780d499933eeeab26b4f9dd5ae473a980c69b17c"} Oct 05 21:09:46 crc kubenswrapper[4753]: I1005 21:09:46.520323 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12f6da285bc82522a8904649780d499933eeeab26b4f9dd5ae473a980c69b17c" Oct 05 21:09:46 crc kubenswrapper[4753]: I1005 21:09:46.520332 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"adbbbc89-97ba-492f-a842-c9bf33a69480","Type":"ContainerStarted","Data":"07d94f6217e5688cb5f0185aff16cee00edc6eebaaf775af8cc862f35932a9a2"} Oct 05 21:09:46 crc kubenswrapper[4753]: I1005 21:09:46.520342 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" 
event={"ID":"adbbbc89-97ba-492f-a842-c9bf33a69480","Type":"ContainerStarted","Data":"572d78db7fe66ad178acea8309c5005d1b33b17efd204addbe8a3b0408e656b2"} Oct 05 21:09:46 crc kubenswrapper[4753]: I1005 21:09:46.518654 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-g89nn" Oct 05 21:09:47 crc kubenswrapper[4753]: I1005 21:09:47.262866 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55b7f7b494-jrzrq"] Oct 05 21:09:47 crc kubenswrapper[4753]: I1005 21:09:47.363473 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-745b9fcf5d-xkxjq"] Oct 05 21:09:47 crc kubenswrapper[4753]: I1005 21:09:47.444910 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 05 21:09:47 crc kubenswrapper[4753]: W1005 21:09:47.452233 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75e67d63_2798_48de_af86_f8334eecdc26.slice/crio-1e673deed5e21fc7eb257514bff002e6d5a4d552be1856141d90ad6b8d25acf9 WatchSource:0}: Error finding container 1e673deed5e21fc7eb257514bff002e6d5a4d552be1856141d90ad6b8d25acf9: Status 404 returned error can't find the container with id 1e673deed5e21fc7eb257514bff002e6d5a4d552be1856141d90ad6b8d25acf9 Oct 05 21:09:47 crc kubenswrapper[4753]: I1005 21:09:47.535041 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-745b9fcf5d-xkxjq" event={"ID":"e1309d62-7702-49bc-892f-705d8ac9fff3","Type":"ContainerStarted","Data":"274e5cfc0628762783a16145f118fdab9a27dbe62fa06c0a90416a276d7cbf25"} Oct 05 21:09:47 crc kubenswrapper[4753]: I1005 21:09:47.548419 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"75e67d63-2798-48de-af86-f8334eecdc26","Type":"ContainerStarted","Data":"1e673deed5e21fc7eb257514bff002e6d5a4d552be1856141d90ad6b8d25acf9"} Oct 05 
21:09:47 crc kubenswrapper[4753]: I1005 21:09:47.556880 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55b7f7b494-jrzrq" event={"ID":"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac","Type":"ContainerStarted","Data":"0e3d8faf8c3de3067cce948a0afcac160c6dad86f73786c528635e84403b916d"} Oct 05 21:09:47 crc kubenswrapper[4753]: W1005 21:09:47.570404 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4606d5be_d97d_4c1b_95df_1aad021ced17.slice/crio-b7ba56fba4e1fedb9ddd0f577208f9af0cba0f1f42c885ae787620def20a57a2 WatchSource:0}: Error finding container b7ba56fba4e1fedb9ddd0f577208f9af0cba0f1f42c885ae787620def20a57a2: Status 404 returned error can't find the container with id b7ba56fba4e1fedb9ddd0f577208f9af0cba0f1f42c885ae787620def20a57a2 Oct 05 21:09:47 crc kubenswrapper[4753]: I1005 21:09:47.577013 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 05 21:09:47 crc kubenswrapper[4753]: I1005 21:09:47.603742 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.378264675 podStartE2EDuration="6.603723329s" podCreationTimestamp="2025-10-05 21:09:41 +0000 UTC" firstStartedPulling="2025-10-05 21:09:42.58726791 +0000 UTC m=+3291.435596142" lastFinishedPulling="2025-10-05 21:09:45.812726564 +0000 UTC m=+3294.661054796" observedRunningTime="2025-10-05 21:09:47.600774278 +0000 UTC m=+3296.449102510" watchObservedRunningTime="2025-10-05 21:09:47.603723329 +0000 UTC m=+3296.452051561" Oct 05 21:09:47 crc kubenswrapper[4753]: I1005 21:09:47.647195 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=4.87651334 podStartE2EDuration="6.647177939s" podCreationTimestamp="2025-10-05 21:09:41 +0000 UTC" firstStartedPulling="2025-10-05 21:09:42.714396561 +0000 UTC m=+3291.562724793" 
lastFinishedPulling="2025-10-05 21:09:44.48506116 +0000 UTC m=+3293.333389392" observedRunningTime="2025-10-05 21:09:47.64624446 +0000 UTC m=+3296.494572712" watchObservedRunningTime="2025-10-05 21:09:47.647177939 +0000 UTC m=+3296.495506161" Oct 05 21:09:48 crc kubenswrapper[4753]: I1005 21:09:48.570601 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4606d5be-d97d-4c1b-95df-1aad021ced17","Type":"ContainerStarted","Data":"b7ba56fba4e1fedb9ddd0f577208f9af0cba0f1f42c885ae787620def20a57a2"} Oct 05 21:09:49 crc kubenswrapper[4753]: I1005 21:09:49.603491 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"75e67d63-2798-48de-af86-f8334eecdc26","Type":"ContainerStarted","Data":"9ac66f7ebc83eec353e8bd2083f4f5b6e1681611e0308c8ece73e438ea1fe1e9"} Oct 05 21:09:49 crc kubenswrapper[4753]: I1005 21:09:49.611426 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4606d5be-d97d-4c1b-95df-1aad021ced17","Type":"ContainerStarted","Data":"fe7e2f46d07d030eafb8c3fa20407d755f2110b33234e29206787ca4cd3a34f1"} Oct 05 21:09:50 crc kubenswrapper[4753]: I1005 21:09:50.623062 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"75e67d63-2798-48de-af86-f8334eecdc26","Type":"ContainerStarted","Data":"18cc3dfea9472fc53c0271ef46c93831db009d1ad41ce446e4aa934a4dff9e37"} Oct 05 21:09:50 crc kubenswrapper[4753]: I1005 21:09:50.623228 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="75e67d63-2798-48de-af86-f8334eecdc26" containerName="glance-log" containerID="cri-o://9ac66f7ebc83eec353e8bd2083f4f5b6e1681611e0308c8ece73e438ea1fe1e9" gracePeriod=30 Oct 05 21:09:50 crc kubenswrapper[4753]: I1005 21:09:50.623250 4753 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/glance-default-external-api-0" podUID="75e67d63-2798-48de-af86-f8334eecdc26" containerName="glance-httpd" containerID="cri-o://18cc3dfea9472fc53c0271ef46c93831db009d1ad41ce446e4aa934a4dff9e37" gracePeriod=30 Oct 05 21:09:50 crc kubenswrapper[4753]: I1005 21:09:50.628170 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4606d5be-d97d-4c1b-95df-1aad021ced17","Type":"ContainerStarted","Data":"11f3e988c3897923551911ac2e640a343b3ee5d79a2bc979186cd8b37c25c51b"} Oct 05 21:09:50 crc kubenswrapper[4753]: I1005 21:09:50.645118 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.645099364 podStartE2EDuration="6.645099364s" podCreationTimestamp="2025-10-05 21:09:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 21:09:50.643792953 +0000 UTC m=+3299.492121195" watchObservedRunningTime="2025-10-05 21:09:50.645099364 +0000 UTC m=+3299.493427596" Oct 05 21:09:50 crc kubenswrapper[4753]: I1005 21:09:50.681476 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.681457805 podStartE2EDuration="6.681457805s" podCreationTimestamp="2025-10-05 21:09:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 21:09:50.673589243 +0000 UTC m=+3299.521917485" watchObservedRunningTime="2025-10-05 21:09:50.681457805 +0000 UTC m=+3299.529786037" Oct 05 21:09:51 crc kubenswrapper[4753]: I1005 21:09:51.642870 4753 generic.go:334] "Generic (PLEG): container finished" podID="75e67d63-2798-48de-af86-f8334eecdc26" containerID="18cc3dfea9472fc53c0271ef46c93831db009d1ad41ce446e4aa934a4dff9e37" exitCode=0 Oct 05 21:09:51 crc kubenswrapper[4753]: I1005 
21:09:51.642910 4753 generic.go:334] "Generic (PLEG): container finished" podID="75e67d63-2798-48de-af86-f8334eecdc26" containerID="9ac66f7ebc83eec353e8bd2083f4f5b6e1681611e0308c8ece73e438ea1fe1e9" exitCode=143 Oct 05 21:09:51 crc kubenswrapper[4753]: I1005 21:09:51.642919 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"75e67d63-2798-48de-af86-f8334eecdc26","Type":"ContainerDied","Data":"18cc3dfea9472fc53c0271ef46c93831db009d1ad41ce446e4aa934a4dff9e37"} Oct 05 21:09:51 crc kubenswrapper[4753]: I1005 21:09:51.643000 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"75e67d63-2798-48de-af86-f8334eecdc26","Type":"ContainerDied","Data":"9ac66f7ebc83eec353e8bd2083f4f5b6e1681611e0308c8ece73e438ea1fe1e9"} Oct 05 21:09:51 crc kubenswrapper[4753]: I1005 21:09:51.690413 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:51 crc kubenswrapper[4753]: I1005 21:09:51.732624 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 05 21:09:51 crc kubenswrapper[4753]: I1005 21:09:51.941672 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 05 21:09:51 crc kubenswrapper[4753]: I1005 21:09:51.998971 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 05 21:09:52 crc kubenswrapper[4753]: I1005 21:09:52.206217 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-b5a7-account-create-k477x"] Oct 05 21:09:52 crc kubenswrapper[4753]: E1005 21:09:52.206976 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de08ec5-cac4-4f6b-8b34-e63f0d613e00" containerName="mariadb-database-create" Oct 05 21:09:52 crc kubenswrapper[4753]: I1005 21:09:52.207016 4753 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9de08ec5-cac4-4f6b-8b34-e63f0d613e00" containerName="mariadb-database-create" Oct 05 21:09:52 crc kubenswrapper[4753]: I1005 21:09:52.207273 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="9de08ec5-cac4-4f6b-8b34-e63f0d613e00" containerName="mariadb-database-create" Oct 05 21:09:52 crc kubenswrapper[4753]: I1005 21:09:52.208012 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-b5a7-account-create-k477x" Oct 05 21:09:52 crc kubenswrapper[4753]: I1005 21:09:52.209836 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 05 21:09:52 crc kubenswrapper[4753]: I1005 21:09:52.240020 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-b5a7-account-create-k477x"] Oct 05 21:09:52 crc kubenswrapper[4753]: I1005 21:09:52.314871 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcgj5\" (UniqueName: \"kubernetes.io/projected/5bf632a4-72e3-4567-b353-bf19ee21c255-kube-api-access-dcgj5\") pod \"manila-b5a7-account-create-k477x\" (UID: \"5bf632a4-72e3-4567-b353-bf19ee21c255\") " pod="openstack/manila-b5a7-account-create-k477x" Oct 05 21:09:52 crc kubenswrapper[4753]: I1005 21:09:52.417327 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcgj5\" (UniqueName: \"kubernetes.io/projected/5bf632a4-72e3-4567-b353-bf19ee21c255-kube-api-access-dcgj5\") pod \"manila-b5a7-account-create-k477x\" (UID: \"5bf632a4-72e3-4567-b353-bf19ee21c255\") " pod="openstack/manila-b5a7-account-create-k477x" Oct 05 21:09:52 crc kubenswrapper[4753]: I1005 21:09:52.444551 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcgj5\" (UniqueName: \"kubernetes.io/projected/5bf632a4-72e3-4567-b353-bf19ee21c255-kube-api-access-dcgj5\") pod \"manila-b5a7-account-create-k477x\" (UID: 
\"5bf632a4-72e3-4567-b353-bf19ee21c255\") " pod="openstack/manila-b5a7-account-create-k477x" Oct 05 21:09:52 crc kubenswrapper[4753]: I1005 21:09:52.540860 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-b5a7-account-create-k477x" Oct 05 21:09:55 crc kubenswrapper[4753]: I1005 21:09:55.410180 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 05 21:09:55 crc kubenswrapper[4753]: I1005 21:09:55.410494 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 05 21:09:55 crc kubenswrapper[4753]: I1005 21:09:55.538843 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 05 21:09:55 crc kubenswrapper[4753]: I1005 21:09:55.540064 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 05 21:09:55 crc kubenswrapper[4753]: I1005 21:09:55.678413 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 05 21:09:55 crc kubenswrapper[4753]: I1005 21:09:55.679555 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.800207 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.909912 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-config-data\") pod \"75e67d63-2798-48de-af86-f8334eecdc26\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.910310 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75e67d63-2798-48de-af86-f8334eecdc26-httpd-run\") pod \"75e67d63-2798-48de-af86-f8334eecdc26\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.910370 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndrhv\" (UniqueName: \"kubernetes.io/projected/75e67d63-2798-48de-af86-f8334eecdc26-kube-api-access-ndrhv\") pod \"75e67d63-2798-48de-af86-f8334eecdc26\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.910434 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/75e67d63-2798-48de-af86-f8334eecdc26-ceph\") pod \"75e67d63-2798-48de-af86-f8334eecdc26\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.910463 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-scripts\") pod \"75e67d63-2798-48de-af86-f8334eecdc26\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.910519 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/75e67d63-2798-48de-af86-f8334eecdc26-logs\") pod \"75e67d63-2798-48de-af86-f8334eecdc26\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.910573 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-combined-ca-bundle\") pod \"75e67d63-2798-48de-af86-f8334eecdc26\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.910621 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-public-tls-certs\") pod \"75e67d63-2798-48de-af86-f8334eecdc26\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.910642 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"75e67d63-2798-48de-af86-f8334eecdc26\" (UID: \"75e67d63-2798-48de-af86-f8334eecdc26\") " Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.911399 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e67d63-2798-48de-af86-f8334eecdc26-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "75e67d63-2798-48de-af86-f8334eecdc26" (UID: "75e67d63-2798-48de-af86-f8334eecdc26"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.912445 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e67d63-2798-48de-af86-f8334eecdc26-logs" (OuterVolumeSpecName: "logs") pod "75e67d63-2798-48de-af86-f8334eecdc26" (UID: "75e67d63-2798-48de-af86-f8334eecdc26"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.913541 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e67d63-2798-48de-af86-f8334eecdc26-ceph" (OuterVolumeSpecName: "ceph") pod "75e67d63-2798-48de-af86-f8334eecdc26" (UID: "75e67d63-2798-48de-af86-f8334eecdc26"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.914849 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-scripts" (OuterVolumeSpecName: "scripts") pod "75e67d63-2798-48de-af86-f8334eecdc26" (UID: "75e67d63-2798-48de-af86-f8334eecdc26"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.914982 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "75e67d63-2798-48de-af86-f8334eecdc26" (UID: "75e67d63-2798-48de-af86-f8334eecdc26"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.915051 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e67d63-2798-48de-af86-f8334eecdc26-kube-api-access-ndrhv" (OuterVolumeSpecName: "kube-api-access-ndrhv") pod "75e67d63-2798-48de-af86-f8334eecdc26" (UID: "75e67d63-2798-48de-af86-f8334eecdc26"). InnerVolumeSpecName "kube-api-access-ndrhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.942161 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75e67d63-2798-48de-af86-f8334eecdc26" (UID: "75e67d63-2798-48de-af86-f8334eecdc26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.977685 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "75e67d63-2798-48de-af86-f8334eecdc26" (UID: "75e67d63-2798-48de-af86-f8334eecdc26"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.979356 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-b5a7-account-create-k477x"] Oct 05 21:09:56 crc kubenswrapper[4753]: I1005 21:09:56.979955 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-config-data" (OuterVolumeSpecName: "config-data") pod "75e67d63-2798-48de-af86-f8334eecdc26" (UID: "75e67d63-2798-48de-af86-f8334eecdc26"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:09:56 crc kubenswrapper[4753]: W1005 21:09:56.984278 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bf632a4_72e3_4567_b353_bf19ee21c255.slice/crio-04dc326f5fcdcae6800c2773f0cfdc7dc2e188ab4995c2ce490eaf53fe4aa60e WatchSource:0}: Error finding container 04dc326f5fcdcae6800c2773f0cfdc7dc2e188ab4995c2ce490eaf53fe4aa60e: Status 404 returned error can't find the container with id 04dc326f5fcdcae6800c2773f0cfdc7dc2e188ab4995c2ce490eaf53fe4aa60e Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.013589 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75e67d63-2798-48de-af86-f8334eecdc26-logs\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.013618 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.013631 4753 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.013678 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.013691 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.013701 4753 reconciler_common.go:293] "Volume 
detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/75e67d63-2798-48de-af86-f8334eecdc26-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.013713 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndrhv\" (UniqueName: \"kubernetes.io/projected/75e67d63-2798-48de-af86-f8334eecdc26-kube-api-access-ndrhv\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.017484 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/75e67d63-2798-48de-af86-f8334eecdc26-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.017499 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75e67d63-2798-48de-af86-f8334eecdc26-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.036692 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.118888 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.716558 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-745b9fcf5d-xkxjq" event={"ID":"e1309d62-7702-49bc-892f-705d8ac9fff3","Type":"ContainerStarted","Data":"c454f0e991ab6a7ca6efc5705b3386ebb31ddde68d5121ad2c1f365d1a718452"} Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.716902 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-745b9fcf5d-xkxjq" 
event={"ID":"e1309d62-7702-49bc-892f-705d8ac9fff3","Type":"ContainerStarted","Data":"8061aceec87022d04eab814c464ebb4327f32951b21a19c83e604e436d29f674"} Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.719614 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.719634 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"75e67d63-2798-48de-af86-f8334eecdc26","Type":"ContainerDied","Data":"1e673deed5e21fc7eb257514bff002e6d5a4d552be1856141d90ad6b8d25acf9"} Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.719704 4753 scope.go:117] "RemoveContainer" containerID="18cc3dfea9472fc53c0271ef46c93831db009d1ad41ce446e4aa934a4dff9e37" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.723741 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b5a7-account-create-k477x" event={"ID":"5bf632a4-72e3-4567-b353-bf19ee21c255","Type":"ContainerStarted","Data":"4ffd383613139a6b942451b9eed5aebd5c0764ae6ba72b97519fcaeac9d4d2b4"} Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.723783 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b5a7-account-create-k477x" event={"ID":"5bf632a4-72e3-4567-b353-bf19ee21c255","Type":"ContainerStarted","Data":"04dc326f5fcdcae6800c2773f0cfdc7dc2e188ab4995c2ce490eaf53fe4aa60e"} Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.726719 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85489f4f6c-b854l" event={"ID":"8d78ff70-0c68-4694-8b5b-c07b2e0c207f","Type":"ContainerStarted","Data":"792e333eb7ae0e408a3ef95c143e9498fd429ef82d743f3e265f251f456c3109"} Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.726747 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85489f4f6c-b854l" 
event={"ID":"8d78ff70-0c68-4694-8b5b-c07b2e0c207f","Type":"ContainerStarted","Data":"883a0dd89985bf9fb761ea63436f5f7e2f05909f6e4ffb21ee2b1481e3724c03"} Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.726845 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-85489f4f6c-b854l" podUID="8d78ff70-0c68-4694-8b5b-c07b2e0c207f" containerName="horizon-log" containerID="cri-o://883a0dd89985bf9fb761ea63436f5f7e2f05909f6e4ffb21ee2b1481e3724c03" gracePeriod=30 Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.727081 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-85489f4f6c-b854l" podUID="8d78ff70-0c68-4694-8b5b-c07b2e0c207f" containerName="horizon" containerID="cri-o://792e333eb7ae0e408a3ef95c143e9498fd429ef82d743f3e265f251f456c3109" gracePeriod=30 Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.729693 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55b7f7b494-jrzrq" event={"ID":"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac","Type":"ContainerStarted","Data":"bd538def4f88bd5b4cb375fe1ce651aa53f18a5f9a12c0076341cfb4f176b206"} Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.729833 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55b7f7b494-jrzrq" event={"ID":"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac","Type":"ContainerStarted","Data":"2fd3ed10b25399b5b6080325002d9ee0926b4851ba516017af7dd927b562f21b"} Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.732227 4753 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.732242 4753 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.736929 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c854c57b9-xprgc" podUID="9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" 
containerName="horizon-log" containerID="cri-o://64698888501a404ffcf5f36069e70f76e99d374d9941758abb20c0a3890bf13b" gracePeriod=30 Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.737059 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c854c57b9-xprgc" event={"ID":"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0","Type":"ContainerStarted","Data":"5a0360cf5b542dc7b4750195c9360e0e437014289001f936dfdae6d956815bbc"} Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.737085 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c854c57b9-xprgc" event={"ID":"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0","Type":"ContainerStarted","Data":"64698888501a404ffcf5f36069e70f76e99d374d9941758abb20c0a3890bf13b"} Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.737117 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c854c57b9-xprgc" podUID="9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" containerName="horizon" containerID="cri-o://5a0360cf5b542dc7b4750195c9360e0e437014289001f936dfdae6d956815bbc" gracePeriod=30 Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.747005 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-745b9fcf5d-xkxjq" podStartSLOduration=4.809170973 podStartE2EDuration="13.746986459s" podCreationTimestamp="2025-10-05 21:09:44 +0000 UTC" firstStartedPulling="2025-10-05 21:09:47.368925457 +0000 UTC m=+3296.217253689" lastFinishedPulling="2025-10-05 21:09:56.306740943 +0000 UTC m=+3305.155069175" observedRunningTime="2025-10-05 21:09:57.737559298 +0000 UTC m=+3306.585887530" watchObservedRunningTime="2025-10-05 21:09:57.746986459 +0000 UTC m=+3306.595314691" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.772825 4753 scope.go:117] "RemoveContainer" containerID="9ac66f7ebc83eec353e8bd2083f4f5b6e1681611e0308c8ece73e438ea1fe1e9" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.780489 4753 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c854c57b9-xprgc" podStartSLOduration=2.930322324 podStartE2EDuration="15.780462422s" podCreationTimestamp="2025-10-05 21:09:42 +0000 UTC" firstStartedPulling="2025-10-05 21:09:43.420310906 +0000 UTC m=+3292.268639128" lastFinishedPulling="2025-10-05 21:09:56.270450994 +0000 UTC m=+3305.118779226" observedRunningTime="2025-10-05 21:09:57.768611615 +0000 UTC m=+3306.616939837" watchObservedRunningTime="2025-10-05 21:09:57.780462422 +0000 UTC m=+3306.628790654" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.792882 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-85489f4f6c-b854l" podStartSLOduration=2.886089579 podStartE2EDuration="15.792866374s" podCreationTimestamp="2025-10-05 21:09:42 +0000 UTC" firstStartedPulling="2025-10-05 21:09:43.36210083 +0000 UTC m=+3292.210429062" lastFinishedPulling="2025-10-05 21:09:56.268877625 +0000 UTC m=+3305.117205857" observedRunningTime="2025-10-05 21:09:57.788469728 +0000 UTC m=+3306.636797960" watchObservedRunningTime="2025-10-05 21:09:57.792866374 +0000 UTC m=+3306.641194606" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.813659 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-b5a7-account-create-k477x" podStartSLOduration=5.813633644 podStartE2EDuration="5.813633644s" podCreationTimestamp="2025-10-05 21:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 21:09:57.799411695 +0000 UTC m=+3306.647739927" watchObservedRunningTime="2025-10-05 21:09:57.813633644 +0000 UTC m=+3306.661961876" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.829876 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-55b7f7b494-jrzrq" podStartSLOduration=4.750172083 podStartE2EDuration="13.829852325s" 
podCreationTimestamp="2025-10-05 21:09:44 +0000 UTC" firstStartedPulling="2025-10-05 21:09:47.263443483 +0000 UTC m=+3296.111771715" lastFinishedPulling="2025-10-05 21:09:56.343123725 +0000 UTC m=+3305.191451957" observedRunningTime="2025-10-05 21:09:57.816498543 +0000 UTC m=+3306.664826775" watchObservedRunningTime="2025-10-05 21:09:57.829852325 +0000 UTC m=+3306.678180557" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.849606 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.881972 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.882623 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 05 21:09:57 crc kubenswrapper[4753]: E1005 21:09:57.883036 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e67d63-2798-48de-af86-f8334eecdc26" containerName="glance-log" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.883111 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e67d63-2798-48de-af86-f8334eecdc26" containerName="glance-log" Oct 05 21:09:57 crc kubenswrapper[4753]: E1005 21:09:57.883217 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e67d63-2798-48de-af86-f8334eecdc26" containerName="glance-httpd" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.883282 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e67d63-2798-48de-af86-f8334eecdc26" containerName="glance-httpd" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.883555 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e67d63-2798-48de-af86-f8334eecdc26" containerName="glance-log" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.883820 4753 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="75e67d63-2798-48de-af86-f8334eecdc26" containerName="glance-httpd" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.885242 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.890526 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.891075 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.897835 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.944815 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e4e4554e-b923-40f1-ac86-abc4cb871d21-ceph\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.944937 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e4554e-b923-40f1-ac86-abc4cb871d21-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.944979 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e4554e-b923-40f1-ac86-abc4cb871d21-scripts\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:57 crc 
kubenswrapper[4753]: I1005 21:09:57.945001 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4e4554e-b923-40f1-ac86-abc4cb871d21-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.945152 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e4554e-b923-40f1-ac86-abc4cb871d21-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.945186 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e4554e-b923-40f1-ac86-abc4cb871d21-logs\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.945201 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4z9m\" (UniqueName: \"kubernetes.io/projected/e4e4554e-b923-40f1-ac86-abc4cb871d21-kube-api-access-z4z9m\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.945300 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 
21:09:57 crc kubenswrapper[4753]: I1005 21:09:57.945486 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e4554e-b923-40f1-ac86-abc4cb871d21-config-data\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.047222 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e4e4554e-b923-40f1-ac86-abc4cb871d21-ceph\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.047284 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e4554e-b923-40f1-ac86-abc4cb871d21-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.047310 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e4554e-b923-40f1-ac86-abc4cb871d21-scripts\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.047329 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4e4554e-b923-40f1-ac86-abc4cb871d21-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.047376 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e4554e-b923-40f1-ac86-abc4cb871d21-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.047397 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e4554e-b923-40f1-ac86-abc4cb871d21-logs\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.047415 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4z9m\" (UniqueName: \"kubernetes.io/projected/e4e4554e-b923-40f1-ac86-abc4cb871d21-kube-api-access-z4z9m\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.047443 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.047503 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e4554e-b923-40f1-ac86-abc4cb871d21-config-data\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.048466 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/e4e4554e-b923-40f1-ac86-abc4cb871d21-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.049021 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.049051 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e4554e-b923-40f1-ac86-abc4cb871d21-logs\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.061095 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e4554e-b923-40f1-ac86-abc4cb871d21-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.062284 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e4554e-b923-40f1-ac86-abc4cb871d21-config-data\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.062964 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e4e4554e-b923-40f1-ac86-abc4cb871d21-ceph\") pod 
\"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.077783 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e4554e-b923-40f1-ac86-abc4cb871d21-scripts\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.077959 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4z9m\" (UniqueName: \"kubernetes.io/projected/e4e4554e-b923-40f1-ac86-abc4cb871d21-kube-api-access-z4z9m\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.078572 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e4554e-b923-40f1-ac86-abc4cb871d21-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.090858 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e4e4554e-b923-40f1-ac86-abc4cb871d21\") " pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.222907 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.745401 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.756235 4753 generic.go:334] "Generic (PLEG): container finished" podID="5bf632a4-72e3-4567-b353-bf19ee21c255" containerID="4ffd383613139a6b942451b9eed5aebd5c0764ae6ba72b97519fcaeac9d4d2b4" exitCode=0 Oct 05 21:09:58 crc kubenswrapper[4753]: I1005 21:09:58.757743 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b5a7-account-create-k477x" event={"ID":"5bf632a4-72e3-4567-b353-bf19ee21c255","Type":"ContainerDied","Data":"4ffd383613139a6b942451b9eed5aebd5c0764ae6ba72b97519fcaeac9d4d2b4"} Oct 05 21:09:59 crc kubenswrapper[4753]: I1005 21:09:59.778103 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4e4554e-b923-40f1-ac86-abc4cb871d21","Type":"ContainerStarted","Data":"97787e74c56bef7224428045b4eca36e55a032aa07964bd8050b73ab464c126b"} Oct 05 21:09:59 crc kubenswrapper[4753]: I1005 21:09:59.778957 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4e4554e-b923-40f1-ac86-abc4cb871d21","Type":"ContainerStarted","Data":"397e9091781114566a4115a4bfa8f3276ff31a3b3d5b0b77020025bd5d04dc04"} Oct 05 21:09:59 crc kubenswrapper[4753]: I1005 21:09:59.870021 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e67d63-2798-48de-af86-f8334eecdc26" path="/var/lib/kubelet/pods/75e67d63-2798-48de-af86-f8334eecdc26/volumes" Oct 05 21:10:00 crc kubenswrapper[4753]: I1005 21:10:00.206987 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-b5a7-account-create-k477x" Oct 05 21:10:00 crc kubenswrapper[4753]: I1005 21:10:00.311102 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcgj5\" (UniqueName: \"kubernetes.io/projected/5bf632a4-72e3-4567-b353-bf19ee21c255-kube-api-access-dcgj5\") pod \"5bf632a4-72e3-4567-b353-bf19ee21c255\" (UID: \"5bf632a4-72e3-4567-b353-bf19ee21c255\") " Oct 05 21:10:00 crc kubenswrapper[4753]: I1005 21:10:00.316319 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bf632a4-72e3-4567-b353-bf19ee21c255-kube-api-access-dcgj5" (OuterVolumeSpecName: "kube-api-access-dcgj5") pod "5bf632a4-72e3-4567-b353-bf19ee21c255" (UID: "5bf632a4-72e3-4567-b353-bf19ee21c255"). InnerVolumeSpecName "kube-api-access-dcgj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:10:00 crc kubenswrapper[4753]: I1005 21:10:00.413390 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcgj5\" (UniqueName: \"kubernetes.io/projected/5bf632a4-72e3-4567-b353-bf19ee21c255-kube-api-access-dcgj5\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:00 crc kubenswrapper[4753]: I1005 21:10:00.585270 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 05 21:10:00 crc kubenswrapper[4753]: I1005 21:10:00.585391 4753 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 05 21:10:00 crc kubenswrapper[4753]: I1005 21:10:00.625648 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 05 21:10:00 crc kubenswrapper[4753]: I1005 21:10:00.789295 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-b5a7-account-create-k477x" Oct 05 21:10:00 crc kubenswrapper[4753]: I1005 21:10:00.789302 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b5a7-account-create-k477x" event={"ID":"5bf632a4-72e3-4567-b353-bf19ee21c255","Type":"ContainerDied","Data":"04dc326f5fcdcae6800c2773f0cfdc7dc2e188ab4995c2ce490eaf53fe4aa60e"} Oct 05 21:10:00 crc kubenswrapper[4753]: I1005 21:10:00.789646 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04dc326f5fcdcae6800c2773f0cfdc7dc2e188ab4995c2ce490eaf53fe4aa60e" Oct 05 21:10:00 crc kubenswrapper[4753]: I1005 21:10:00.791815 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4e4554e-b923-40f1-ac86-abc4cb871d21","Type":"ContainerStarted","Data":"5cedd84976eb3242cd4b1201a461b05e61f6e60a6cd0b58ebde05148c63c44b3"} Oct 05 21:10:00 crc kubenswrapper[4753]: I1005 21:10:00.821952 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.821934469 podStartE2EDuration="3.821934469s" podCreationTimestamp="2025-10-05 21:09:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 21:10:00.816905283 +0000 UTC m=+3309.665233515" watchObservedRunningTime="2025-10-05 21:10:00.821934469 +0000 UTC m=+3309.670262701" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.576441 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-k5rlq"] Oct 05 21:10:02 crc kubenswrapper[4753]: E1005 21:10:02.577017 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf632a4-72e3-4567-b353-bf19ee21c255" containerName="mariadb-account-create" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.577030 4753 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5bf632a4-72e3-4567-b353-bf19ee21c255" containerName="mariadb-account-create" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.577254 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bf632a4-72e3-4567-b353-bf19ee21c255" containerName="mariadb-account-create" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.578094 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.582428 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-lkqhl" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.582554 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.585060 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-k5rlq"] Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.657655 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-config-data\") pod \"manila-db-sync-k5rlq\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.658209 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd5mn\" (UniqueName: \"kubernetes.io/projected/31365694-05a6-4386-98db-b2054a6464f4-kube-api-access-xd5mn\") pod \"manila-db-sync-k5rlq\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.658289 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-combined-ca-bundle\") pod \"manila-db-sync-k5rlq\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.658423 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-job-config-data\") pod \"manila-db-sync-k5rlq\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.757064 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.760291 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-job-config-data\") pod \"manila-db-sync-k5rlq\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.760356 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-config-data\") pod \"manila-db-sync-k5rlq\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.760441 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd5mn\" (UniqueName: \"kubernetes.io/projected/31365694-05a6-4386-98db-b2054a6464f4-kube-api-access-xd5mn\") pod \"manila-db-sync-k5rlq\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.760773 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-combined-ca-bundle\") pod \"manila-db-sync-k5rlq\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.766730 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-job-config-data\") pod \"manila-db-sync-k5rlq\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.767383 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-combined-ca-bundle\") pod \"manila-db-sync-k5rlq\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.777895 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-config-data\") pod \"manila-db-sync-k5rlq\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.789402 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd5mn\" (UniqueName: \"kubernetes.io/projected/31365694-05a6-4386-98db-b2054a6464f4-kube-api-access-xd5mn\") pod \"manila-db-sync-k5rlq\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.802056 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:10:02 crc kubenswrapper[4753]: I1005 21:10:02.893485 4753 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:03 crc kubenswrapper[4753]: I1005 21:10:03.703117 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-k5rlq"] Oct 05 21:10:03 crc kubenswrapper[4753]: W1005 21:10:03.722536 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31365694_05a6_4386_98db_b2054a6464f4.slice/crio-18e382f0237bfd592d37532cab1aca49627e4e43502a104bce63e67c0b968df7 WatchSource:0}: Error finding container 18e382f0237bfd592d37532cab1aca49627e4e43502a104bce63e67c0b968df7: Status 404 returned error can't find the container with id 18e382f0237bfd592d37532cab1aca49627e4e43502a104bce63e67c0b968df7 Oct 05 21:10:03 crc kubenswrapper[4753]: I1005 21:10:03.815183 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-k5rlq" event={"ID":"31365694-05a6-4386-98db-b2054a6464f4","Type":"ContainerStarted","Data":"18e382f0237bfd592d37532cab1aca49627e4e43502a104bce63e67c0b968df7"} Oct 05 21:10:05 crc kubenswrapper[4753]: I1005 21:10:05.316011 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:10:05 crc kubenswrapper[4753]: I1005 21:10:05.317178 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:10:05 crc kubenswrapper[4753]: I1005 21:10:05.427767 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:10:05 crc kubenswrapper[4753]: I1005 21:10:05.428245 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:10:08 crc kubenswrapper[4753]: I1005 21:10:08.223606 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Oct 05 21:10:08 crc kubenswrapper[4753]: I1005 21:10:08.224230 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 05 21:10:08 crc kubenswrapper[4753]: I1005 21:10:08.299067 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 05 21:10:08 crc kubenswrapper[4753]: I1005 21:10:08.329524 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 05 21:10:08 crc kubenswrapper[4753]: I1005 21:10:08.855700 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 05 21:10:08 crc kubenswrapper[4753]: I1005 21:10:08.855751 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 05 21:10:11 crc kubenswrapper[4753]: I1005 21:10:11.223094 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 05 21:10:11 crc kubenswrapper[4753]: I1005 21:10:11.224703 4753 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 05 21:10:11 crc kubenswrapper[4753]: I1005 21:10:11.312783 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 05 21:10:11 crc kubenswrapper[4753]: I1005 21:10:11.908259 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-k5rlq" event={"ID":"31365694-05a6-4386-98db-b2054a6464f4","Type":"ContainerStarted","Data":"12db8e1d570e1e8ffac739b0f8836070f44c7198293bce0878f2d0ec9414634d"} Oct 05 21:10:11 crc kubenswrapper[4753]: I1005 21:10:11.928661 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-k5rlq" podStartSLOduration=2.909628807 
podStartE2EDuration="9.928637906s" podCreationTimestamp="2025-10-05 21:10:02 +0000 UTC" firstStartedPulling="2025-10-05 21:10:03.725298706 +0000 UTC m=+3312.573626938" lastFinishedPulling="2025-10-05 21:10:10.744307805 +0000 UTC m=+3319.592636037" observedRunningTime="2025-10-05 21:10:11.923287291 +0000 UTC m=+3320.771615523" watchObservedRunningTime="2025-10-05 21:10:11.928637906 +0000 UTC m=+3320.776966138" Oct 05 21:10:15 crc kubenswrapper[4753]: I1005 21:10:15.317109 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-55b7f7b494-jrzrq" podUID="ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.244:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.244:8443: connect: connection refused" Oct 05 21:10:15 crc kubenswrapper[4753]: I1005 21:10:15.428876 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-745b9fcf5d-xkxjq" podUID="e1309d62-7702-49bc-892f-705d8ac9fff3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.246:8443: connect: connection refused" Oct 05 21:10:22 crc kubenswrapper[4753]: I1005 21:10:22.995354 4753 generic.go:334] "Generic (PLEG): container finished" podID="31365694-05a6-4386-98db-b2054a6464f4" containerID="12db8e1d570e1e8ffac739b0f8836070f44c7198293bce0878f2d0ec9414634d" exitCode=0 Oct 05 21:10:22 crc kubenswrapper[4753]: I1005 21:10:22.995457 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-k5rlq" event={"ID":"31365694-05a6-4386-98db-b2054a6464f4","Type":"ContainerDied","Data":"12db8e1d570e1e8ffac739b0f8836070f44c7198293bce0878f2d0ec9414634d"} Oct 05 21:10:24 crc kubenswrapper[4753]: I1005 21:10:24.537950 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:24 crc kubenswrapper[4753]: I1005 21:10:24.631996 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd5mn\" (UniqueName: \"kubernetes.io/projected/31365694-05a6-4386-98db-b2054a6464f4-kube-api-access-xd5mn\") pod \"31365694-05a6-4386-98db-b2054a6464f4\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " Oct 05 21:10:24 crc kubenswrapper[4753]: I1005 21:10:24.632050 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-combined-ca-bundle\") pod \"31365694-05a6-4386-98db-b2054a6464f4\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " Oct 05 21:10:24 crc kubenswrapper[4753]: I1005 21:10:24.632266 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-config-data\") pod \"31365694-05a6-4386-98db-b2054a6464f4\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " Oct 05 21:10:24 crc kubenswrapper[4753]: I1005 21:10:24.632300 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-job-config-data\") pod \"31365694-05a6-4386-98db-b2054a6464f4\" (UID: \"31365694-05a6-4386-98db-b2054a6464f4\") " Oct 05 21:10:24 crc kubenswrapper[4753]: I1005 21:10:24.641266 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "31365694-05a6-4386-98db-b2054a6464f4" (UID: "31365694-05a6-4386-98db-b2054a6464f4"). InnerVolumeSpecName "job-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:24 crc kubenswrapper[4753]: I1005 21:10:24.641411 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31365694-05a6-4386-98db-b2054a6464f4-kube-api-access-xd5mn" (OuterVolumeSpecName: "kube-api-access-xd5mn") pod "31365694-05a6-4386-98db-b2054a6464f4" (UID: "31365694-05a6-4386-98db-b2054a6464f4"). InnerVolumeSpecName "kube-api-access-xd5mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:10:24 crc kubenswrapper[4753]: I1005 21:10:24.646277 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-config-data" (OuterVolumeSpecName: "config-data") pod "31365694-05a6-4386-98db-b2054a6464f4" (UID: "31365694-05a6-4386-98db-b2054a6464f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:24 crc kubenswrapper[4753]: I1005 21:10:24.673433 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31365694-05a6-4386-98db-b2054a6464f4" (UID: "31365694-05a6-4386-98db-b2054a6464f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:24 crc kubenswrapper[4753]: I1005 21:10:24.734693 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:24 crc kubenswrapper[4753]: I1005 21:10:24.734742 4753 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:24 crc kubenswrapper[4753]: I1005 21:10:24.734760 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd5mn\" (UniqueName: \"kubernetes.io/projected/31365694-05a6-4386-98db-b2054a6464f4-kube-api-access-xd5mn\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:24 crc kubenswrapper[4753]: I1005 21:10:24.734773 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31365694-05a6-4386-98db-b2054a6464f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.012066 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-k5rlq" event={"ID":"31365694-05a6-4386-98db-b2054a6464f4","Type":"ContainerDied","Data":"18e382f0237bfd592d37532cab1aca49627e4e43502a104bce63e67c0b968df7"} Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.012361 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18e382f0237bfd592d37532cab1aca49627e4e43502a104bce63e67c0b968df7" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.012108 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-k5rlq" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.321691 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 05 21:10:25 crc kubenswrapper[4753]: E1005 21:10:25.322752 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31365694-05a6-4386-98db-b2054a6464f4" containerName="manila-db-sync" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.322865 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="31365694-05a6-4386-98db-b2054a6464f4" containerName="manila-db-sync" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.323325 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="31365694-05a6-4386-98db-b2054a6464f4" containerName="manila-db-sync" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.324728 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.329158 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.329670 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-lkqhl" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.330297 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.330536 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.330758 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.331021 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.338646 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.404009 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.435904 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.453109 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07d7a9f9-7e60-4c3f-a87b-480299693d92-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.453159 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-scripts\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.453197 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26sr4\" (UniqueName: \"kubernetes.io/projected/07d7a9f9-7e60-4c3f-a87b-480299693d92-kube-api-access-26sr4\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.453228 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-config-data\") pod 
\"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.453272 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.453299 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.453325 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.453347 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/07d7a9f9-7e60-4c3f-a87b-480299693d92-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.453369 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/07d7a9f9-7e60-4c3f-a87b-480299693d92-ceph\") pod \"manila-share-share1-0\" (UID: 
\"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.453390 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-config-data\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.453424 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-scripts\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.453492 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e054bfac-2418-40f6-9744-3e490a616e70-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.453507 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np6mh\" (UniqueName: \"kubernetes.io/projected/e054bfac-2418-40f6-9744-3e490a616e70-kube-api-access-np6mh\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.453541 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " 
pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.557159 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-scripts\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.557489 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e054bfac-2418-40f6-9744-3e490a616e70-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.557510 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np6mh\" (UniqueName: \"kubernetes.io/projected/e054bfac-2418-40f6-9744-3e490a616e70-kube-api-access-np6mh\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.557547 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.557595 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07d7a9f9-7e60-4c3f-a87b-480299693d92-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.557615 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-scripts\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.557648 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26sr4\" (UniqueName: \"kubernetes.io/projected/07d7a9f9-7e60-4c3f-a87b-480299693d92-kube-api-access-26sr4\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.557675 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-config-data\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.557707 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.557733 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.557955 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.557977 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/07d7a9f9-7e60-4c3f-a87b-480299693d92-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.558000 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/07d7a9f9-7e60-4c3f-a87b-480299693d92-ceph\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.558027 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-config-data\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.559810 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07d7a9f9-7e60-4c3f-a87b-480299693d92-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.560170 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e054bfac-2418-40f6-9744-3e490a616e70-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " 
pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.560281 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/07d7a9f9-7e60-4c3f-a87b-480299693d92-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.565570 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.566253 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.566721 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-config-data\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.567851 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-scripts\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.567925 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-config-data\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.570642 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/07d7a9f9-7e60-4c3f-a87b-480299693d92-ceph\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.571609 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.576820 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.590057 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-scripts\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.601561 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np6mh\" (UniqueName: \"kubernetes.io/projected/e054bfac-2418-40f6-9744-3e490a616e70-kube-api-access-np6mh\") pod \"manila-scheduler-0\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " 
pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.610859 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26sr4\" (UniqueName: \"kubernetes.io/projected/07d7a9f9-7e60-4c3f-a87b-480299693d92-kube-api-access-26sr4\") pod \"manila-share-share1-0\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.669592 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.676568 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.751733 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67948f47bf-jnd5v"] Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.753799 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.782999 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67948f47bf-jnd5v"] Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.864259 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdjnm\" (UniqueName: \"kubernetes.io/projected/16d16fc5-ebf6-49b5-a837-7b19a005ee21-kube-api-access-vdjnm\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.864343 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16d16fc5-ebf6-49b5-a837-7b19a005ee21-ovsdbserver-nb\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.864370 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d16fc5-ebf6-49b5-a837-7b19a005ee21-config\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.864387 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16d16fc5-ebf6-49b5-a837-7b19a005ee21-dns-svc\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.864429 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16d16fc5-ebf6-49b5-a837-7b19a005ee21-ovsdbserver-sb\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.864486 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/16d16fc5-ebf6-49b5-a837-7b19a005ee21-openstack-edpm-ipam\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.969640 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/16d16fc5-ebf6-49b5-a837-7b19a005ee21-openstack-edpm-ipam\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.969755 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdjnm\" (UniqueName: \"kubernetes.io/projected/16d16fc5-ebf6-49b5-a837-7b19a005ee21-kube-api-access-vdjnm\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.969840 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16d16fc5-ebf6-49b5-a837-7b19a005ee21-ovsdbserver-nb\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.969864 4753 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d16fc5-ebf6-49b5-a837-7b19a005ee21-config\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.969883 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16d16fc5-ebf6-49b5-a837-7b19a005ee21-dns-svc\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.969939 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16d16fc5-ebf6-49b5-a837-7b19a005ee21-ovsdbserver-sb\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.974550 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/16d16fc5-ebf6-49b5-a837-7b19a005ee21-openstack-edpm-ipam\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.975983 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d16fc5-ebf6-49b5-a837-7b19a005ee21-config\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.976871 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/16d16fc5-ebf6-49b5-a837-7b19a005ee21-ovsdbserver-nb\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.977394 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16d16fc5-ebf6-49b5-a837-7b19a005ee21-ovsdbserver-sb\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:25 crc kubenswrapper[4753]: I1005 21:10:25.977934 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16d16fc5-ebf6-49b5-a837-7b19a005ee21-dns-svc\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.020272 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdjnm\" (UniqueName: \"kubernetes.io/projected/16d16fc5-ebf6-49b5-a837-7b19a005ee21-kube-api-access-vdjnm\") pod \"dnsmasq-dns-67948f47bf-jnd5v\" (UID: \"16d16fc5-ebf6-49b5-a837-7b19a005ee21\") " pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.020367 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.023210 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.028396 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.061812 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.071282 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4ddd\" (UniqueName: \"kubernetes.io/projected/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-kube-api-access-j4ddd\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.071345 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-scripts\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.071384 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.071411 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-config-data-custom\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.071426 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-logs\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.071485 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-config-data\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.071532 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-etc-machine-id\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.183785 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.183792 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-config-data\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.184485 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-etc-machine-id\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.184557 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ddd\" (UniqueName: \"kubernetes.io/projected/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-kube-api-access-j4ddd\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.184616 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-scripts\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.184665 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.184699 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-config-data-custom\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.184719 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-logs\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.185132 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-logs\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.189435 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-etc-machine-id\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.193896 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-config-data-custom\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.195813 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-scripts\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.197572 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-config-data\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.198211 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.238195 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4ddd\" (UniqueName: \"kubernetes.io/projected/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-kube-api-access-j4ddd\") pod \"manila-api-0\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.379571 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.446488 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.467595 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.718037 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67948f47bf-jnd5v"] Oct 05 21:10:26 crc kubenswrapper[4753]: I1005 21:10:26.828880 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 05 21:10:27 crc kubenswrapper[4753]: I1005 21:10:27.047534 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 05 21:10:27 crc kubenswrapper[4753]: W1005 21:10:27.050264 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7a25a47_9ebd_4ed7_bd8d_0dc961f2df85.slice/crio-e3adce6099ad5cd136294f2748b12541d565a062c04c4b7b7ca4fb5899339c1b WatchSource:0}: Error finding container e3adce6099ad5cd136294f2748b12541d565a062c04c4b7b7ca4fb5899339c1b: Status 404 returned error can't find the container with id e3adce6099ad5cd136294f2748b12541d565a062c04c4b7b7ca4fb5899339c1b Oct 05 21:10:27 crc kubenswrapper[4753]: I1005 21:10:27.078552 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"07d7a9f9-7e60-4c3f-a87b-480299693d92","Type":"ContainerStarted","Data":"e5d4bb7abe653f30a420905fc8e26b2976301848a6733d7d0bf82a7e1d5949a8"} Oct 05 21:10:27 crc kubenswrapper[4753]: I1005 21:10:27.081496 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e054bfac-2418-40f6-9744-3e490a616e70","Type":"ContainerStarted","Data":"e2a2b20d11461caf35bbfcdd486d7660b8c7eac6a5f55c4f4f13e4812308b873"} Oct 
05 21:10:27 crc kubenswrapper[4753]: I1005 21:10:27.086608 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" event={"ID":"16d16fc5-ebf6-49b5-a837-7b19a005ee21","Type":"ContainerStarted","Data":"d70005bdaccd7ab0f60ce2fa43e350108d8a76a1177df5a5170ffd9a19c01504"} Oct 05 21:10:27 crc kubenswrapper[4753]: I1005 21:10:27.092214 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85","Type":"ContainerStarted","Data":"e3adce6099ad5cd136294f2748b12541d565a062c04c4b7b7ca4fb5899339c1b"} Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.101504 4753 generic.go:334] "Generic (PLEG): container finished" podID="16d16fc5-ebf6-49b5-a837-7b19a005ee21" containerID="dec57380c1a1aa73f0be7a6413e1b9bdd7b98c99c9410f64691eb83638e8fe74" exitCode=0 Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.102025 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" event={"ID":"16d16fc5-ebf6-49b5-a837-7b19a005ee21","Type":"ContainerDied","Data":"dec57380c1a1aa73f0be7a6413e1b9bdd7b98c99c9410f64691eb83638e8fe74"} Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.136051 4753 generic.go:334] "Generic (PLEG): container finished" podID="8d78ff70-0c68-4694-8b5b-c07b2e0c207f" containerID="792e333eb7ae0e408a3ef95c143e9498fd429ef82d743f3e265f251f456c3109" exitCode=137 Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.136382 4753 generic.go:334] "Generic (PLEG): container finished" podID="8d78ff70-0c68-4694-8b5b-c07b2e0c207f" containerID="883a0dd89985bf9fb761ea63436f5f7e2f05909f6e4ffb21ee2b1481e3724c03" exitCode=137 Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.136514 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85489f4f6c-b854l" 
event={"ID":"8d78ff70-0c68-4694-8b5b-c07b2e0c207f","Type":"ContainerDied","Data":"792e333eb7ae0e408a3ef95c143e9498fd429ef82d743f3e265f251f456c3109"} Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.136595 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85489f4f6c-b854l" event={"ID":"8d78ff70-0c68-4694-8b5b-c07b2e0c207f","Type":"ContainerDied","Data":"883a0dd89985bf9fb761ea63436f5f7e2f05909f6e4ffb21ee2b1481e3724c03"} Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.140398 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85","Type":"ContainerStarted","Data":"f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47"} Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.143999 4753 generic.go:334] "Generic (PLEG): container finished" podID="9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" containerID="5a0360cf5b542dc7b4750195c9360e0e437014289001f936dfdae6d956815bbc" exitCode=137 Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.144028 4753 generic.go:334] "Generic (PLEG): container finished" podID="9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" containerID="64698888501a404ffcf5f36069e70f76e99d374d9941758abb20c0a3890bf13b" exitCode=137 Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.144052 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c854c57b9-xprgc" event={"ID":"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0","Type":"ContainerDied","Data":"5a0360cf5b542dc7b4750195c9360e0e437014289001f936dfdae6d956815bbc"} Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.144083 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c854c57b9-xprgc" event={"ID":"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0","Type":"ContainerDied","Data":"64698888501a404ffcf5f36069e70f76e99d374d9941758abb20c0a3890bf13b"} Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.506260 4753 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.552365 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.687744 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-scripts\") pod \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.687779 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-config-data\") pod \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.687812 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-config-data\") pod \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.687855 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-logs\") pod \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.687923 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-horizon-secret-key\") pod \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " Oct 05 
21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.687960 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-logs\") pod \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.688046 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb57t\" (UniqueName: \"kubernetes.io/projected/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-kube-api-access-wb57t\") pod \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.688068 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd4vn\" (UniqueName: \"kubernetes.io/projected/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-kube-api-access-nd4vn\") pod \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.688086 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-horizon-secret-key\") pod \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\" (UID: \"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0\") " Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.688116 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-scripts\") pod \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\" (UID: \"8d78ff70-0c68-4694-8b5b-c07b2e0c207f\") " Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.694741 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-logs" (OuterVolumeSpecName: "logs") 
pod "9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" (UID: "9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.701295 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-logs" (OuterVolumeSpecName: "logs") pod "8d78ff70-0c68-4694-8b5b-c07b2e0c207f" (UID: "8d78ff70-0c68-4694-8b5b-c07b2e0c207f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.715966 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-kube-api-access-nd4vn" (OuterVolumeSpecName: "kube-api-access-nd4vn") pod "9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" (UID: "9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0"). InnerVolumeSpecName "kube-api-access-nd4vn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.730965 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-kube-api-access-wb57t" (OuterVolumeSpecName: "kube-api-access-wb57t") pod "8d78ff70-0c68-4694-8b5b-c07b2e0c207f" (UID: "8d78ff70-0c68-4694-8b5b-c07b2e0c207f"). InnerVolumeSpecName "kube-api-access-wb57t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.734226 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" (UID: "9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.744208 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8d78ff70-0c68-4694-8b5b-c07b2e0c207f" (UID: "8d78ff70-0c68-4694-8b5b-c07b2e0c207f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.791946 4753 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.791992 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-logs\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.792002 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb57t\" (UniqueName: \"kubernetes.io/projected/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-kube-api-access-wb57t\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.792015 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd4vn\" (UniqueName: \"kubernetes.io/projected/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-kube-api-access-nd4vn\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.792100 4753 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.792111 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-logs\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.797178 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-config-data" (OuterVolumeSpecName: "config-data") pod "9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" (UID: "9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.807322 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-scripts" (OuterVolumeSpecName: "scripts") pod "8d78ff70-0c68-4694-8b5b-c07b2e0c207f" (UID: "8d78ff70-0c68-4694-8b5b-c07b2e0c207f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.811446 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-config-data" (OuterVolumeSpecName: "config-data") pod "8d78ff70-0c68-4694-8b5b-c07b2e0c207f" (UID: "8d78ff70-0c68-4694-8b5b-c07b2e0c207f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.812439 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-scripts" (OuterVolumeSpecName: "scripts") pod "9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" (UID: "9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.893957 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.893985 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.894012 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:28 crc kubenswrapper[4753]: I1005 21:10:28.894020 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d78ff70-0c68-4694-8b5b-c07b2e0c207f-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.160509 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e054bfac-2418-40f6-9744-3e490a616e70","Type":"ContainerStarted","Data":"4c91d52ec12a3193146c1d03144a6efb0904a8040935097cd50f721bfb459a3c"} Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.160560 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e054bfac-2418-40f6-9744-3e490a616e70","Type":"ContainerStarted","Data":"cfd96578c8aa2bb625423365c4ecd52e9795850174928395e6f2e088e0190c73"} Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.165342 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" 
event={"ID":"16d16fc5-ebf6-49b5-a837-7b19a005ee21","Type":"ContainerStarted","Data":"c37fe7d0f4af18ca99a7f71358512b91b62312f46bf9e107d5fe02d792339297"} Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.165382 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.171875 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85489f4f6c-b854l" event={"ID":"8d78ff70-0c68-4694-8b5b-c07b2e0c207f","Type":"ContainerDied","Data":"6c6cc5aae59a503a47d225f94dd5c64d5c7aca8176a26ddf07281f349b1dc12b"} Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.171926 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85489f4f6c-b854l" Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.171948 4753 scope.go:117] "RemoveContainer" containerID="792e333eb7ae0e408a3ef95c143e9498fd429ef82d743f3e265f251f456c3109" Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.177231 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85","Type":"ContainerStarted","Data":"36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040"} Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.177296 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.186622 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c854c57b9-xprgc" Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.190829 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c854c57b9-xprgc" event={"ID":"9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0","Type":"ContainerDied","Data":"8e3e9a51b2ddc2a6f56137e08114e2228fb63862167cc7b4ead8152204d63ab2"} Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.210341 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.084010646 podStartE2EDuration="4.210318328s" podCreationTimestamp="2025-10-05 21:10:25 +0000 UTC" firstStartedPulling="2025-10-05 21:10:26.467384739 +0000 UTC m=+3335.315712971" lastFinishedPulling="2025-10-05 21:10:27.593692421 +0000 UTC m=+3336.442020653" observedRunningTime="2025-10-05 21:10:29.195115208 +0000 UTC m=+3338.043443440" watchObservedRunningTime="2025-10-05 21:10:29.210318328 +0000 UTC m=+3338.058646560" Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.246340 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85489f4f6c-b854l"] Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.287507 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-85489f4f6c-b854l"] Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.288851 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" podStartSLOduration=4.28883653 podStartE2EDuration="4.28883653s" podCreationTimestamp="2025-10-05 21:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 21:10:29.28659816 +0000 UTC m=+3338.134926392" watchObservedRunningTime="2025-10-05 21:10:29.28883653 +0000 UTC m=+3338.137164762" Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.315274 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/manila-api-0"] Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.321663 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.321643362 podStartE2EDuration="4.321643362s" podCreationTimestamp="2025-10-05 21:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 21:10:29.307690712 +0000 UTC m=+3338.156018944" watchObservedRunningTime="2025-10-05 21:10:29.321643362 +0000 UTC m=+3338.169971594" Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.352966 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c854c57b9-xprgc"] Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.353049 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c854c57b9-xprgc"] Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.460422 4753 scope.go:117] "RemoveContainer" containerID="883a0dd89985bf9fb761ea63436f5f7e2f05909f6e4ffb21ee2b1481e3724c03" Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.497752 4753 scope.go:117] "RemoveContainer" containerID="5a0360cf5b542dc7b4750195c9360e0e437014289001f936dfdae6d956815bbc" Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.685816 4753 scope.go:117] "RemoveContainer" containerID="64698888501a404ffcf5f36069e70f76e99d374d9941758abb20c0a3890bf13b" Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.864943 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d78ff70-0c68-4694-8b5b-c07b2e0c207f" path="/var/lib/kubelet/pods/8d78ff70-0c68-4694-8b5b-c07b2e0c207f/volumes" Oct 05 21:10:29 crc kubenswrapper[4753]: I1005 21:10:29.865773 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" path="/var/lib/kubelet/pods/9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0/volumes" Oct 05 21:10:30 crc kubenswrapper[4753]: I1005 
21:10:30.315804 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:10:30 crc kubenswrapper[4753]: I1005 21:10:30.425739 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:10:31 crc kubenswrapper[4753]: I1005 21:10:31.206757 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" containerName="manila-api-log" containerID="cri-o://f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47" gracePeriod=30 Oct 05 21:10:31 crc kubenswrapper[4753]: I1005 21:10:31.207056 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" containerName="manila-api" containerID="cri-o://36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040" gracePeriod=30 Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.006262 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.076588 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-logs\") pod \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.076646 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-combined-ca-bundle\") pod \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.076674 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-config-data\") pod \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.076703 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-scripts\") pod \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.076728 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-config-data-custom\") pod \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.076789 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-etc-machine-id\") pod \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.076814 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4ddd\" (UniqueName: \"kubernetes.io/projected/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-kube-api-access-j4ddd\") pod \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\" (UID: \"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85\") " Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.078398 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-logs" (OuterVolumeSpecName: "logs") pod "b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" (UID: "b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.078936 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" (UID: "b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.088890 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" (UID: "b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.091053 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-kube-api-access-j4ddd" (OuterVolumeSpecName: "kube-api-access-j4ddd") pod "b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" (UID: "b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85"). InnerVolumeSpecName "kube-api-access-j4ddd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.111622 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-scripts" (OuterVolumeSpecName: "scripts") pod "b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" (UID: "b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.159318 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" (UID: "b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.178131 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-config-data" (OuterVolumeSpecName: "config-data") pod "b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" (UID: "b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.178773 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.178800 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.178810 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.178818 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.178827 4753 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.178835 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4ddd\" (UniqueName: \"kubernetes.io/projected/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-kube-api-access-j4ddd\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.178846 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85-logs\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.218975 4753 generic.go:334] "Generic 
(PLEG): container finished" podID="b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" containerID="36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040" exitCode=0 Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.219006 4753 generic.go:334] "Generic (PLEG): container finished" podID="b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" containerID="f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47" exitCode=143 Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.219026 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85","Type":"ContainerDied","Data":"36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040"} Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.219052 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85","Type":"ContainerDied","Data":"f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47"} Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.219062 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85","Type":"ContainerDied","Data":"e3adce6099ad5cd136294f2748b12541d565a062c04c4b7b7ca4fb5899339c1b"} Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.219065 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.219077 4753 scope.go:117] "RemoveContainer" containerID="36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.252214 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.254181 4753 scope.go:117] "RemoveContainer" containerID="f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.262391 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.280161 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 05 21:10:32 crc kubenswrapper[4753]: E1005 21:10:32.280531 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d78ff70-0c68-4694-8b5b-c07b2e0c207f" containerName="horizon" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.280550 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d78ff70-0c68-4694-8b5b-c07b2e0c207f" containerName="horizon" Oct 05 21:10:32 crc kubenswrapper[4753]: E1005 21:10:32.280571 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" containerName="horizon" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.280595 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" containerName="horizon" Oct 05 21:10:32 crc kubenswrapper[4753]: E1005 21:10:32.280625 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" containerName="horizon-log" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.280632 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" 
containerName="horizon-log" Oct 05 21:10:32 crc kubenswrapper[4753]: E1005 21:10:32.280643 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" containerName="manila-api-log" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.280649 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" containerName="manila-api-log" Oct 05 21:10:32 crc kubenswrapper[4753]: E1005 21:10:32.280658 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" containerName="manila-api" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.280664 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" containerName="manila-api" Oct 05 21:10:32 crc kubenswrapper[4753]: E1005 21:10:32.280675 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d78ff70-0c68-4694-8b5b-c07b2e0c207f" containerName="horizon-log" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.280680 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d78ff70-0c68-4694-8b5b-c07b2e0c207f" containerName="horizon-log" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.280836 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" containerName="horizon-log" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.280850 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d78ff70-0c68-4694-8b5b-c07b2e0c207f" containerName="horizon" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.280862 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" containerName="manila-api-log" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.280876 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c8dd0cb-cdf9-49ae-91eb-4dd161894ee0" containerName="horizon" Oct 05 
21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.280887 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" containerName="manila-api" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.280897 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d78ff70-0c68-4694-8b5b-c07b2e0c207f" containerName="horizon-log" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.282481 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.290813 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.291010 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.291312 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.326549 4753 scope.go:117] "RemoveContainer" containerID="36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.333523 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 05 21:10:32 crc kubenswrapper[4753]: E1005 21:10:32.341980 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040\": container with ID starting with 36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040 not found: ID does not exist" containerID="36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.342123 4753 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040"} err="failed to get container status \"36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040\": rpc error: code = NotFound desc = could not find container \"36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040\": container with ID starting with 36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040 not found: ID does not exist" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.342232 4753 scope.go:117] "RemoveContainer" containerID="f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47" Oct 05 21:10:32 crc kubenswrapper[4753]: E1005 21:10:32.342565 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47\": container with ID starting with f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47 not found: ID does not exist" containerID="f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.342636 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47"} err="failed to get container status \"f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47\": rpc error: code = NotFound desc = could not find container \"f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47\": container with ID starting with f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47 not found: ID does not exist" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.342695 4753 scope.go:117] "RemoveContainer" containerID="36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.346992 4753 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040"} err="failed to get container status \"36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040\": rpc error: code = NotFound desc = could not find container \"36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040\": container with ID starting with 36eeb9ee1b85eb51a20cb63d3b9eaeae2d1d73cc017bdb5177108175332a7040 not found: ID does not exist" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.347105 4753 scope.go:117] "RemoveContainer" containerID="f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.355891 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47"} err="failed to get container status \"f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47\": rpc error: code = NotFound desc = could not find container \"f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47\": container with ID starting with f1fb8456ea148a6d45073ce31829f3e75669d4a12685a8635f45b326543bcb47 not found: ID does not exist" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.384161 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.384212 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-scripts\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 
21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.384234 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-internal-tls-certs\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.384308 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-config-data-custom\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.384338 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1597495-6f4a-4887-bacc-8082ad9784d4-logs\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.384360 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-public-tls-certs\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.384376 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-config-data\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.384432 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d1597495-6f4a-4887-bacc-8082ad9784d4-etc-machine-id\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.384477 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgbjp\" (UniqueName: \"kubernetes.io/projected/d1597495-6f4a-4887-bacc-8082ad9784d4-kube-api-access-wgbjp\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.485680 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.486011 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-scripts\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.486034 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-internal-tls-certs\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.486426 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-config-data-custom\") pod \"manila-api-0\" (UID: 
\"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.486462 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1597495-6f4a-4887-bacc-8082ad9784d4-logs\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.486485 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-public-tls-certs\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.486502 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-config-data\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.486547 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d1597495-6f4a-4887-bacc-8082ad9784d4-etc-machine-id\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.486601 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgbjp\" (UniqueName: \"kubernetes.io/projected/d1597495-6f4a-4887-bacc-8082ad9784d4-kube-api-access-wgbjp\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.488775 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/d1597495-6f4a-4887-bacc-8082ad9784d4-etc-machine-id\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.489088 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1597495-6f4a-4887-bacc-8082ad9784d4-logs\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.490980 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-scripts\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.491220 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-internal-tls-certs\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.492707 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.495468 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-public-tls-certs\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.498720 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-config-data-custom\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.506433 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgbjp\" (UniqueName: \"kubernetes.io/projected/d1597495-6f4a-4887-bacc-8082ad9784d4-kube-api-access-wgbjp\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.511210 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1597495-6f4a-4887-bacc-8082ad9784d4-config-data\") pod \"manila-api-0\" (UID: \"d1597495-6f4a-4887-bacc-8082ad9784d4\") " pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.660784 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-745b9fcf5d-xkxjq" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.665076 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.685209 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 05 21:10:32 crc kubenswrapper[4753]: I1005 21:10:32.736355 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55b7f7b494-jrzrq"] Oct 05 21:10:33 crc kubenswrapper[4753]: I1005 21:10:33.240128 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-55b7f7b494-jrzrq" podUID="ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" containerName="horizon-log" containerID="cri-o://2fd3ed10b25399b5b6080325002d9ee0926b4851ba516017af7dd927b562f21b" gracePeriod=30 Oct 05 21:10:33 crc kubenswrapper[4753]: I1005 21:10:33.240179 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-55b7f7b494-jrzrq" podUID="ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" containerName="horizon" containerID="cri-o://bd538def4f88bd5b4cb375fe1ce651aa53f18a5f9a12c0076341cfb4f176b206" gracePeriod=30 Oct 05 21:10:33 crc kubenswrapper[4753]: W1005 21:10:33.390086 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1597495_6f4a_4887_bacc_8082ad9784d4.slice/crio-76e43009d0fbce8707452cd1ab98253cef4bf1aeeab26345dc0445360af09ac9 WatchSource:0}: Error finding container 76e43009d0fbce8707452cd1ab98253cef4bf1aeeab26345dc0445360af09ac9: Status 404 returned error can't find the container with id 76e43009d0fbce8707452cd1ab98253cef4bf1aeeab26345dc0445360af09ac9 Oct 05 21:10:33 crc kubenswrapper[4753]: I1005 21:10:33.392711 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 05 21:10:33 crc kubenswrapper[4753]: I1005 21:10:33.863858 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85" path="/var/lib/kubelet/pods/b7a25a47-9ebd-4ed7-bd8d-0dc961f2df85/volumes" Oct 05 21:10:34 crc kubenswrapper[4753]: I1005 21:10:34.260583 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-api-0" event={"ID":"d1597495-6f4a-4887-bacc-8082ad9784d4","Type":"ContainerStarted","Data":"759fd70eff1b471b5429998261eb9dd9716ac1739857f3e82adc18d327e2be95"} Oct 05 21:10:34 crc kubenswrapper[4753]: I1005 21:10:34.261245 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d1597495-6f4a-4887-bacc-8082ad9784d4","Type":"ContainerStarted","Data":"76e43009d0fbce8707452cd1ab98253cef4bf1aeeab26345dc0445360af09ac9"} Oct 05 21:10:35 crc kubenswrapper[4753]: I1005 21:10:35.677868 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 05 21:10:36 crc kubenswrapper[4753]: I1005 21:10:36.186284 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67948f47bf-jnd5v" Oct 05 21:10:36 crc kubenswrapper[4753]: I1005 21:10:36.255819 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bccd47bfc-j5m69"] Oct 05 21:10:36 crc kubenswrapper[4753]: I1005 21:10:36.256099 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" podUID="670bf0ed-f184-4241-b9e7-989781ea4112" containerName="dnsmasq-dns" containerID="cri-o://f7d41b4d7a7921199668ec70cc99775d70275cb49584d84e987bf554ec958c75" gracePeriod=10 Oct 05 21:10:36 crc kubenswrapper[4753]: I1005 21:10:36.344869 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-55b7f7b494-jrzrq" podUID="ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.244:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:54386->10.217.0.244:8443: read: connection reset by peer" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.317950 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.334804 4753 generic.go:334] "Generic (PLEG): container finished" podID="670bf0ed-f184-4241-b9e7-989781ea4112" containerID="f7d41b4d7a7921199668ec70cc99775d70275cb49584d84e987bf554ec958c75" exitCode=0 Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.334854 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" event={"ID":"670bf0ed-f184-4241-b9e7-989781ea4112","Type":"ContainerDied","Data":"f7d41b4d7a7921199668ec70cc99775d70275cb49584d84e987bf554ec958c75"} Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.334884 4753 scope.go:117] "RemoveContainer" containerID="f7d41b4d7a7921199668ec70cc99775d70275cb49584d84e987bf554ec958c75" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.334997 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bccd47bfc-j5m69" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.343204 4753 generic.go:334] "Generic (PLEG): container finished" podID="ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" containerID="bd538def4f88bd5b4cb375fe1ce651aa53f18a5f9a12c0076341cfb4f176b206" exitCode=0 Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.343237 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55b7f7b494-jrzrq" event={"ID":"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac","Type":"ContainerDied","Data":"bd538def4f88bd5b4cb375fe1ce651aa53f18a5f9a12c0076341cfb4f176b206"} Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.385716 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nszgm\" (UniqueName: \"kubernetes.io/projected/670bf0ed-f184-4241-b9e7-989781ea4112-kube-api-access-nszgm\") pod \"670bf0ed-f184-4241-b9e7-989781ea4112\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 
21:10:37.385772 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-openstack-edpm-ipam\") pod \"670bf0ed-f184-4241-b9e7-989781ea4112\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.385803 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-ovsdbserver-sb\") pod \"670bf0ed-f184-4241-b9e7-989781ea4112\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.385976 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-config\") pod \"670bf0ed-f184-4241-b9e7-989781ea4112\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.386430 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-ovsdbserver-nb\") pod \"670bf0ed-f184-4241-b9e7-989781ea4112\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.386540 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-dns-svc\") pod \"670bf0ed-f184-4241-b9e7-989781ea4112\" (UID: \"670bf0ed-f184-4241-b9e7-989781ea4112\") " Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.400478 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/670bf0ed-f184-4241-b9e7-989781ea4112-kube-api-access-nszgm" (OuterVolumeSpecName: "kube-api-access-nszgm") pod 
"670bf0ed-f184-4241-b9e7-989781ea4112" (UID: "670bf0ed-f184-4241-b9e7-989781ea4112"). InnerVolumeSpecName "kube-api-access-nszgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.430360 4753 scope.go:117] "RemoveContainer" containerID="286af67d9e12ad2e5ed379526fbc273e45ffbb62d17a76dbcb0aa6602454d19b" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.440775 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "670bf0ed-f184-4241-b9e7-989781ea4112" (UID: "670bf0ed-f184-4241-b9e7-989781ea4112"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.490376 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nszgm\" (UniqueName: \"kubernetes.io/projected/670bf0ed-f184-4241-b9e7-989781ea4112-kube-api-access-nszgm\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.490407 4753 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.508488 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "670bf0ed-f184-4241-b9e7-989781ea4112" (UID: "670bf0ed-f184-4241-b9e7-989781ea4112"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.515071 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-config" (OuterVolumeSpecName: "config") pod "670bf0ed-f184-4241-b9e7-989781ea4112" (UID: "670bf0ed-f184-4241-b9e7-989781ea4112"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.523063 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "670bf0ed-f184-4241-b9e7-989781ea4112" (UID: "670bf0ed-f184-4241-b9e7-989781ea4112"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.539466 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "670bf0ed-f184-4241-b9e7-989781ea4112" (UID: "670bf0ed-f184-4241-b9e7-989781ea4112"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.592750 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-config\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.592789 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.592800 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.592808 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/670bf0ed-f184-4241-b9e7-989781ea4112-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.683906 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bccd47bfc-j5m69"] Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.697825 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bccd47bfc-j5m69"] Oct 05 21:10:37 crc kubenswrapper[4753]: I1005 21:10:37.870307 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="670bf0ed-f184-4241-b9e7-989781ea4112" path="/var/lib/kubelet/pods/670bf0ed-f184-4241-b9e7-989781ea4112/volumes" Oct 05 21:10:38 crc kubenswrapper[4753]: I1005 21:10:38.353086 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"07d7a9f9-7e60-4c3f-a87b-480299693d92","Type":"ContainerStarted","Data":"b1499066a2f6f14c1b9e94baad00c8192ff51c0e7adf21bd41431fdc24ebd7f6"} Oct 05 21:10:38 
crc kubenswrapper[4753]: I1005 21:10:38.353437 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"07d7a9f9-7e60-4c3f-a87b-480299693d92","Type":"ContainerStarted","Data":"0d911b9d5c07459b55760e6dad72371caa1698a5e45a7485365b02ad5a5779a0"} Oct 05 21:10:38 crc kubenswrapper[4753]: I1005 21:10:38.357250 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d1597495-6f4a-4887-bacc-8082ad9784d4","Type":"ContainerStarted","Data":"5608b3354300526da8b9cc5563375ed8392f319044df438678790ad40f3cba32"} Oct 05 21:10:38 crc kubenswrapper[4753]: I1005 21:10:38.357471 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 05 21:10:38 crc kubenswrapper[4753]: I1005 21:10:38.376798 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.127159267 podStartE2EDuration="13.376783057s" podCreationTimestamp="2025-10-05 21:10:25 +0000 UTC" firstStartedPulling="2025-10-05 21:10:26.848405632 +0000 UTC m=+3335.696733864" lastFinishedPulling="2025-10-05 21:10:37.098029422 +0000 UTC m=+3345.946357654" observedRunningTime="2025-10-05 21:10:38.371227485 +0000 UTC m=+3347.219555717" watchObservedRunningTime="2025-10-05 21:10:38.376783057 +0000 UTC m=+3347.225111289" Oct 05 21:10:38 crc kubenswrapper[4753]: I1005 21:10:38.401163 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=6.401149809 podStartE2EDuration="6.401149809s" podCreationTimestamp="2025-10-05 21:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 21:10:38.398734904 +0000 UTC m=+3347.247063136" watchObservedRunningTime="2025-10-05 21:10:38.401149809 +0000 UTC m=+3347.249478041" Oct 05 21:10:39 crc kubenswrapper[4753]: I1005 21:10:39.584938 4753 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 05 21:10:39 crc kubenswrapper[4753]: I1005 21:10:39.585736 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="ceilometer-central-agent" containerID="cri-o://422ccff053fe6a7167f44481ef576ba9d402cc60a51f0c18fa2b2888805db045" gracePeriod=30 Oct 05 21:10:39 crc kubenswrapper[4753]: I1005 21:10:39.586310 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="sg-core" containerID="cri-o://a7ec303068f89f1d49f0a5667920911c10d5c5c7d4419d9c47146e9fb2952ca6" gracePeriod=30 Oct 05 21:10:39 crc kubenswrapper[4753]: I1005 21:10:39.586319 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="proxy-httpd" containerID="cri-o://8e2d9dada88a876eb4e5d5885fbd9a554538e38f4c502ecba524608791bb8253" gracePeriod=30 Oct 05 21:10:39 crc kubenswrapper[4753]: I1005 21:10:39.586416 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="ceilometer-notification-agent" containerID="cri-o://7aaa8ebf2ca481f8bcd382dcb6b0fcc15d54dec9d2a359984262ae9283c3bf9d" gracePeriod=30 Oct 05 21:10:40 crc kubenswrapper[4753]: I1005 21:10:40.387100 4753 generic.go:334] "Generic (PLEG): container finished" podID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerID="8e2d9dada88a876eb4e5d5885fbd9a554538e38f4c502ecba524608791bb8253" exitCode=0 Oct 05 21:10:40 crc kubenswrapper[4753]: I1005 21:10:40.387153 4753 generic.go:334] "Generic (PLEG): container finished" podID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerID="a7ec303068f89f1d49f0a5667920911c10d5c5c7d4419d9c47146e9fb2952ca6" exitCode=2 Oct 05 
21:10:40 crc kubenswrapper[4753]: I1005 21:10:40.387163 4753 generic.go:334] "Generic (PLEG): container finished" podID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerID="422ccff053fe6a7167f44481ef576ba9d402cc60a51f0c18fa2b2888805db045" exitCode=0 Oct 05 21:10:40 crc kubenswrapper[4753]: I1005 21:10:40.387183 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de29aa2f-3662-4b9b-a32d-f6e0da626cb6","Type":"ContainerDied","Data":"8e2d9dada88a876eb4e5d5885fbd9a554538e38f4c502ecba524608791bb8253"} Oct 05 21:10:40 crc kubenswrapper[4753]: I1005 21:10:40.387207 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de29aa2f-3662-4b9b-a32d-f6e0da626cb6","Type":"ContainerDied","Data":"a7ec303068f89f1d49f0a5667920911c10d5c5c7d4419d9c47146e9fb2952ca6"} Oct 05 21:10:40 crc kubenswrapper[4753]: I1005 21:10:40.387218 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de29aa2f-3662-4b9b-a32d-f6e0da626cb6","Type":"ContainerDied","Data":"422ccff053fe6a7167f44481ef576ba9d402cc60a51f0c18fa2b2888805db045"} Oct 05 21:10:45 crc kubenswrapper[4753]: I1005 21:10:45.316499 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-55b7f7b494-jrzrq" podUID="ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.244:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.244:8443: connect: connection refused" Oct 05 21:10:45 crc kubenswrapper[4753]: I1005 21:10:45.670891 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 05 21:10:47 crc kubenswrapper[4753]: I1005 21:10:47.299835 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 05 21:10:47 crc kubenswrapper[4753]: I1005 21:10:47.346428 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/manila-scheduler-0"] Oct 05 21:10:47 crc kubenswrapper[4753]: I1005 21:10:47.466741 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="e054bfac-2418-40f6-9744-3e490a616e70" containerName="manila-scheduler" containerID="cri-o://cfd96578c8aa2bb625423365c4ecd52e9795850174928395e6f2e088e0190c73" gracePeriod=30 Oct 05 21:10:47 crc kubenswrapper[4753]: I1005 21:10:47.466807 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="e054bfac-2418-40f6-9744-3e490a616e70" containerName="probe" containerID="cri-o://4c91d52ec12a3193146c1d03144a6efb0904a8040935097cd50f721bfb459a3c" gracePeriod=30 Oct 05 21:10:48 crc kubenswrapper[4753]: I1005 21:10:48.479525 4753 generic.go:334] "Generic (PLEG): container finished" podID="e054bfac-2418-40f6-9744-3e490a616e70" containerID="4c91d52ec12a3193146c1d03144a6efb0904a8040935097cd50f721bfb459a3c" exitCode=0 Oct 05 21:10:48 crc kubenswrapper[4753]: I1005 21:10:48.479626 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e054bfac-2418-40f6-9744-3e490a616e70","Type":"ContainerDied","Data":"4c91d52ec12a3193146c1d03144a6efb0904a8040935097cd50f721bfb459a3c"} Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.493358 4753 generic.go:334] "Generic (PLEG): container finished" podID="e054bfac-2418-40f6-9744-3e490a616e70" containerID="cfd96578c8aa2bb625423365c4ecd52e9795850174928395e6f2e088e0190c73" exitCode=0 Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.493678 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e054bfac-2418-40f6-9744-3e490a616e70","Type":"ContainerDied","Data":"cfd96578c8aa2bb625423365c4ecd52e9795850174928395e6f2e088e0190c73"} Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.493710 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-scheduler-0" event={"ID":"e054bfac-2418-40f6-9744-3e490a616e70","Type":"ContainerDied","Data":"e2a2b20d11461caf35bbfcdd486d7660b8c7eac6a5f55c4f4f13e4812308b873"} Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.493724 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2a2b20d11461caf35bbfcdd486d7660b8c7eac6a5f55c4f4f13e4812308b873" Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.564693 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.741898 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-config-data\") pod \"e054bfac-2418-40f6-9744-3e490a616e70\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.741982 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-scripts\") pod \"e054bfac-2418-40f6-9744-3e490a616e70\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.742075 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e054bfac-2418-40f6-9744-3e490a616e70-etc-machine-id\") pod \"e054bfac-2418-40f6-9744-3e490a616e70\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.742130 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-combined-ca-bundle\") pod \"e054bfac-2418-40f6-9744-3e490a616e70\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " Oct 05 21:10:49 crc 
kubenswrapper[4753]: I1005 21:10:49.742186 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np6mh\" (UniqueName: \"kubernetes.io/projected/e054bfac-2418-40f6-9744-3e490a616e70-kube-api-access-np6mh\") pod \"e054bfac-2418-40f6-9744-3e490a616e70\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.742274 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-config-data-custom\") pod \"e054bfac-2418-40f6-9744-3e490a616e70\" (UID: \"e054bfac-2418-40f6-9744-3e490a616e70\") " Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.742635 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e054bfac-2418-40f6-9744-3e490a616e70-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e054bfac-2418-40f6-9744-3e490a616e70" (UID: "e054bfac-2418-40f6-9744-3e490a616e70"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.742863 4753 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e054bfac-2418-40f6-9744-3e490a616e70-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.761519 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e054bfac-2418-40f6-9744-3e490a616e70-kube-api-access-np6mh" (OuterVolumeSpecName: "kube-api-access-np6mh") pod "e054bfac-2418-40f6-9744-3e490a616e70" (UID: "e054bfac-2418-40f6-9744-3e490a616e70"). InnerVolumeSpecName "kube-api-access-np6mh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.762710 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-scripts" (OuterVolumeSpecName: "scripts") pod "e054bfac-2418-40f6-9744-3e490a616e70" (UID: "e054bfac-2418-40f6-9744-3e490a616e70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.774844 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e054bfac-2418-40f6-9744-3e490a616e70" (UID: "e054bfac-2418-40f6-9744-3e490a616e70"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.821693 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e054bfac-2418-40f6-9744-3e490a616e70" (UID: "e054bfac-2418-40f6-9744-3e490a616e70"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.844840 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.844870 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np6mh\" (UniqueName: \"kubernetes.io/projected/e054bfac-2418-40f6-9744-3e490a616e70-kube-api-access-np6mh\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.844881 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.844891 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.870758 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.894383 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-config-data" (OuterVolumeSpecName: "config-data") pod "e054bfac-2418-40f6-9744-3e490a616e70" (UID: "e054bfac-2418-40f6-9744-3e490a616e70"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:49 crc kubenswrapper[4753]: I1005 21:10:49.947126 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e054bfac-2418-40f6-9744-3e490a616e70-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.048958 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-log-httpd\") pod \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.048996 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-ceilometer-tls-certs\") pod \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.049058 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m25hk\" (UniqueName: \"kubernetes.io/projected/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-kube-api-access-m25hk\") pod \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.049089 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-combined-ca-bundle\") pod \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.049115 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-config-data\") pod 
\"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.049160 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-sg-core-conf-yaml\") pod \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.049188 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-scripts\") pod \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.049219 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-run-httpd\") pod \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\" (UID: \"de29aa2f-3662-4b9b-a32d-f6e0da626cb6\") " Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.051241 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "de29aa2f-3662-4b9b-a32d-f6e0da626cb6" (UID: "de29aa2f-3662-4b9b-a32d-f6e0da626cb6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.051289 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "de29aa2f-3662-4b9b-a32d-f6e0da626cb6" (UID: "de29aa2f-3662-4b9b-a32d-f6e0da626cb6"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.056074 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-scripts" (OuterVolumeSpecName: "scripts") pod "de29aa2f-3662-4b9b-a32d-f6e0da626cb6" (UID: "de29aa2f-3662-4b9b-a32d-f6e0da626cb6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.056639 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-kube-api-access-m25hk" (OuterVolumeSpecName: "kube-api-access-m25hk") pod "de29aa2f-3662-4b9b-a32d-f6e0da626cb6" (UID: "de29aa2f-3662-4b9b-a32d-f6e0da626cb6"). InnerVolumeSpecName "kube-api-access-m25hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.080627 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "de29aa2f-3662-4b9b-a32d-f6e0da626cb6" (UID: "de29aa2f-3662-4b9b-a32d-f6e0da626cb6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.122125 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "de29aa2f-3662-4b9b-a32d-f6e0da626cb6" (UID: "de29aa2f-3662-4b9b-a32d-f6e0da626cb6"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.146005 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de29aa2f-3662-4b9b-a32d-f6e0da626cb6" (UID: "de29aa2f-3662-4b9b-a32d-f6e0da626cb6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.152380 4753 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.152410 4753 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.152423 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m25hk\" (UniqueName: \"kubernetes.io/projected/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-kube-api-access-m25hk\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.152431 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.152440 4753 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.152447 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.152455 4753 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.154025 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-config-data" (OuterVolumeSpecName: "config-data") pod "de29aa2f-3662-4b9b-a32d-f6e0da626cb6" (UID: "de29aa2f-3662-4b9b-a32d-f6e0da626cb6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.254490 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de29aa2f-3662-4b9b-a32d-f6e0da626cb6-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.504126 4753 generic.go:334] "Generic (PLEG): container finished" podID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerID="7aaa8ebf2ca481f8bcd382dcb6b0fcc15d54dec9d2a359984262ae9283c3bf9d" exitCode=0 Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.504187 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.504254 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.504285 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de29aa2f-3662-4b9b-a32d-f6e0da626cb6","Type":"ContainerDied","Data":"7aaa8ebf2ca481f8bcd382dcb6b0fcc15d54dec9d2a359984262ae9283c3bf9d"} Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.504315 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de29aa2f-3662-4b9b-a32d-f6e0da626cb6","Type":"ContainerDied","Data":"41a928f4e1c3e65ac5a846fb29d04a09145e0d3c7f72a3231d5a8e9985030b2e"} Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.504337 4753 scope.go:117] "RemoveContainer" containerID="8e2d9dada88a876eb4e5d5885fbd9a554538e38f4c502ecba524608791bb8253" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.545327 4753 scope.go:117] "RemoveContainer" containerID="a7ec303068f89f1d49f0a5667920911c10d5c5c7d4419d9c47146e9fb2952ca6" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.551937 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.581055 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.593483 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.600426 4753 scope.go:117] "RemoveContainer" containerID="7aaa8ebf2ca481f8bcd382dcb6b0fcc15d54dec9d2a359984262ae9283c3bf9d" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.614054 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.623397 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 05 21:10:50 crc kubenswrapper[4753]: E1005 
21:10:50.624070 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="670bf0ed-f184-4241-b9e7-989781ea4112" containerName="dnsmasq-dns" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.624216 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="670bf0ed-f184-4241-b9e7-989781ea4112" containerName="dnsmasq-dns" Oct 05 21:10:50 crc kubenswrapper[4753]: E1005 21:10:50.624332 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="sg-core" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.624409 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="sg-core" Oct 05 21:10:50 crc kubenswrapper[4753]: E1005 21:10:50.624488 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="ceilometer-notification-agent" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.624563 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="ceilometer-notification-agent" Oct 05 21:10:50 crc kubenswrapper[4753]: E1005 21:10:50.624648 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e054bfac-2418-40f6-9744-3e490a616e70" containerName="probe" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.624722 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e054bfac-2418-40f6-9744-3e490a616e70" containerName="probe" Oct 05 21:10:50 crc kubenswrapper[4753]: E1005 21:10:50.624793 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="ceilometer-central-agent" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.624866 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="ceilometer-central-agent" Oct 05 21:10:50 crc kubenswrapper[4753]: E1005 
21:10:50.624943 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="proxy-httpd" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.625008 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="proxy-httpd" Oct 05 21:10:50 crc kubenswrapper[4753]: E1005 21:10:50.625099 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e054bfac-2418-40f6-9744-3e490a616e70" containerName="manila-scheduler" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.625189 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e054bfac-2418-40f6-9744-3e490a616e70" containerName="manila-scheduler" Oct 05 21:10:50 crc kubenswrapper[4753]: E1005 21:10:50.625264 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="670bf0ed-f184-4241-b9e7-989781ea4112" containerName="init" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.625368 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="670bf0ed-f184-4241-b9e7-989781ea4112" containerName="init" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.625685 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="e054bfac-2418-40f6-9744-3e490a616e70" containerName="probe" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.625787 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="ceilometer-central-agent" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.625887 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="proxy-httpd" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.625956 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="e054bfac-2418-40f6-9744-3e490a616e70" containerName="manila-scheduler" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.626025 4753 
memory_manager.go:354] "RemoveStaleState removing state" podUID="670bf0ed-f184-4241-b9e7-989781ea4112" containerName="dnsmasq-dns" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.626099 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="ceilometer-notification-agent" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.626197 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" containerName="sg-core" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.628334 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.630449 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.631769 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.632264 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.634187 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.637165 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.638437 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.642833 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.670340 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.671873 4753 scope.go:117] "RemoveContainer" containerID="422ccff053fe6a7167f44481ef576ba9d402cc60a51f0c18fa2b2888805db045" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.759122 4753 scope.go:117] "RemoveContainer" containerID="8e2d9dada88a876eb4e5d5885fbd9a554538e38f4c502ecba524608791bb8253" Oct 05 21:10:50 crc kubenswrapper[4753]: E1005 21:10:50.759471 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e2d9dada88a876eb4e5d5885fbd9a554538e38f4c502ecba524608791bb8253\": container with ID starting with 8e2d9dada88a876eb4e5d5885fbd9a554538e38f4c502ecba524608791bb8253 not found: ID does not exist" containerID="8e2d9dada88a876eb4e5d5885fbd9a554538e38f4c502ecba524608791bb8253" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.759516 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e2d9dada88a876eb4e5d5885fbd9a554538e38f4c502ecba524608791bb8253"} err="failed to get container status \"8e2d9dada88a876eb4e5d5885fbd9a554538e38f4c502ecba524608791bb8253\": rpc error: code = NotFound desc = could not find container \"8e2d9dada88a876eb4e5d5885fbd9a554538e38f4c502ecba524608791bb8253\": container with ID starting with 8e2d9dada88a876eb4e5d5885fbd9a554538e38f4c502ecba524608791bb8253 not found: ID does not exist" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 
21:10:50.759536 4753 scope.go:117] "RemoveContainer" containerID="a7ec303068f89f1d49f0a5667920911c10d5c5c7d4419d9c47146e9fb2952ca6" Oct 05 21:10:50 crc kubenswrapper[4753]: E1005 21:10:50.759735 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7ec303068f89f1d49f0a5667920911c10d5c5c7d4419d9c47146e9fb2952ca6\": container with ID starting with a7ec303068f89f1d49f0a5667920911c10d5c5c7d4419d9c47146e9fb2952ca6 not found: ID does not exist" containerID="a7ec303068f89f1d49f0a5667920911c10d5c5c7d4419d9c47146e9fb2952ca6" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.759771 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ec303068f89f1d49f0a5667920911c10d5c5c7d4419d9c47146e9fb2952ca6"} err="failed to get container status \"a7ec303068f89f1d49f0a5667920911c10d5c5c7d4419d9c47146e9fb2952ca6\": rpc error: code = NotFound desc = could not find container \"a7ec303068f89f1d49f0a5667920911c10d5c5c7d4419d9c47146e9fb2952ca6\": container with ID starting with a7ec303068f89f1d49f0a5667920911c10d5c5c7d4419d9c47146e9fb2952ca6 not found: ID does not exist" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.759785 4753 scope.go:117] "RemoveContainer" containerID="7aaa8ebf2ca481f8bcd382dcb6b0fcc15d54dec9d2a359984262ae9283c3bf9d" Oct 05 21:10:50 crc kubenswrapper[4753]: E1005 21:10:50.759961 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aaa8ebf2ca481f8bcd382dcb6b0fcc15d54dec9d2a359984262ae9283c3bf9d\": container with ID starting with 7aaa8ebf2ca481f8bcd382dcb6b0fcc15d54dec9d2a359984262ae9283c3bf9d not found: ID does not exist" containerID="7aaa8ebf2ca481f8bcd382dcb6b0fcc15d54dec9d2a359984262ae9283c3bf9d" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.759980 4753 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7aaa8ebf2ca481f8bcd382dcb6b0fcc15d54dec9d2a359984262ae9283c3bf9d"} err="failed to get container status \"7aaa8ebf2ca481f8bcd382dcb6b0fcc15d54dec9d2a359984262ae9283c3bf9d\": rpc error: code = NotFound desc = could not find container \"7aaa8ebf2ca481f8bcd382dcb6b0fcc15d54dec9d2a359984262ae9283c3bf9d\": container with ID starting with 7aaa8ebf2ca481f8bcd382dcb6b0fcc15d54dec9d2a359984262ae9283c3bf9d not found: ID does not exist" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.760010 4753 scope.go:117] "RemoveContainer" containerID="422ccff053fe6a7167f44481ef576ba9d402cc60a51f0c18fa2b2888805db045" Oct 05 21:10:50 crc kubenswrapper[4753]: E1005 21:10:50.760236 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"422ccff053fe6a7167f44481ef576ba9d402cc60a51f0c18fa2b2888805db045\": container with ID starting with 422ccff053fe6a7167f44481ef576ba9d402cc60a51f0c18fa2b2888805db045 not found: ID does not exist" containerID="422ccff053fe6a7167f44481ef576ba9d402cc60a51f0c18fa2b2888805db045" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.760258 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422ccff053fe6a7167f44481ef576ba9d402cc60a51f0c18fa2b2888805db045"} err="failed to get container status \"422ccff053fe6a7167f44481ef576ba9d402cc60a51f0c18fa2b2888805db045\": rpc error: code = NotFound desc = could not find container \"422ccff053fe6a7167f44481ef576ba9d402cc60a51f0c18fa2b2888805db045\": container with ID starting with 422ccff053fe6a7167f44481ef576ba9d402cc60a51f0c18fa2b2888805db045 not found: ID does not exist" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.772759 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-config-data-custom\") pod 
\"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.772827 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68a4d310-a272-4033-af72-dfc6e8c239f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.772882 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68a4d310-a272-4033-af72-dfc6e8c239f6-scripts\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.772897 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a4d310-a272-4033-af72-dfc6e8c239f6-config-data\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.772914 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a4d310-a272-4033-af72-dfc6e8c239f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.772970 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79dvl\" (UniqueName: \"kubernetes.io/projected/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-kube-api-access-79dvl\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 
05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.772994 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68a4d310-a272-4033-af72-dfc6e8c239f6-log-httpd\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.773015 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68a4d310-a272-4033-af72-dfc6e8c239f6-run-httpd\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.773033 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-config-data\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.773057 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-scripts\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.773082 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/68a4d310-a272-4033-af72-dfc6e8c239f6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.773112 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.773153 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qvs4\" (UniqueName: \"kubernetes.io/projected/68a4d310-a272-4033-af72-dfc6e8c239f6-kube-api-access-5qvs4\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.773171 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.874351 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68a4d310-a272-4033-af72-dfc6e8c239f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.874405 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68a4d310-a272-4033-af72-dfc6e8c239f6-scripts\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.874422 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/68a4d310-a272-4033-af72-dfc6e8c239f6-config-data\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.874438 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a4d310-a272-4033-af72-dfc6e8c239f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.874492 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79dvl\" (UniqueName: \"kubernetes.io/projected/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-kube-api-access-79dvl\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.874514 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68a4d310-a272-4033-af72-dfc6e8c239f6-log-httpd\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.874533 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68a4d310-a272-4033-af72-dfc6e8c239f6-run-httpd\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.874550 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-config-data\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc 
kubenswrapper[4753]: I1005 21:10:50.874572 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-scripts\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.874597 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/68a4d310-a272-4033-af72-dfc6e8c239f6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.874647 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.874665 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qvs4\" (UniqueName: \"kubernetes.io/projected/68a4d310-a272-4033-af72-dfc6e8c239f6-kube-api-access-5qvs4\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.874685 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.874726 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.875513 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.875846 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68a4d310-a272-4033-af72-dfc6e8c239f6-log-httpd\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.876026 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68a4d310-a272-4033-af72-dfc6e8c239f6-run-httpd\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.887007 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-config-data\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.889000 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68a4d310-a272-4033-af72-dfc6e8c239f6-scripts\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.889578 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-scripts\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.890248 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/68a4d310-a272-4033-af72-dfc6e8c239f6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.890437 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.890603 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68a4d310-a272-4033-af72-dfc6e8c239f6-config-data\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.892598 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.895829 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a4d310-a272-4033-af72-dfc6e8c239f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.897577 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68a4d310-a272-4033-af72-dfc6e8c239f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.899804 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qvs4\" (UniqueName: \"kubernetes.io/projected/68a4d310-a272-4033-af72-dfc6e8c239f6-kube-api-access-5qvs4\") pod \"ceilometer-0\" (UID: \"68a4d310-a272-4033-af72-dfc6e8c239f6\") " pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.913838 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79dvl\" (UniqueName: \"kubernetes.io/projected/7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a-kube-api-access-79dvl\") pod \"manila-scheduler-0\" (UID: \"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a\") " pod="openstack/manila-scheduler-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.977354 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 05 21:10:50 crc kubenswrapper[4753]: I1005 21:10:50.994787 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 05 21:10:51 crc kubenswrapper[4753]: I1005 21:10:51.444855 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 05 21:10:51 crc kubenswrapper[4753]: I1005 21:10:51.494215 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 05 21:10:51 crc kubenswrapper[4753]: W1005 21:10:51.499295 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ceefd75_1ba6_4518_8903_c9ac6c2d8e6a.slice/crio-0357f9af3a2a042ade2cc79c6a78e5ba2f355928f2a5776ef0fce47d0110d90f WatchSource:0}: Error finding container 0357f9af3a2a042ade2cc79c6a78e5ba2f355928f2a5776ef0fce47d0110d90f: Status 404 returned error can't find the container with id 0357f9af3a2a042ade2cc79c6a78e5ba2f355928f2a5776ef0fce47d0110d90f Oct 05 21:10:51 crc kubenswrapper[4753]: I1005 21:10:51.516018 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a","Type":"ContainerStarted","Data":"0357f9af3a2a042ade2cc79c6a78e5ba2f355928f2a5776ef0fce47d0110d90f"} Oct 05 21:10:51 crc kubenswrapper[4753]: I1005 21:10:51.517510 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68a4d310-a272-4033-af72-dfc6e8c239f6","Type":"ContainerStarted","Data":"03d833b68ddfb444ca59a0b70245a320cc872d2a38db032fe6f0396ca504d5c4"} Oct 05 21:10:51 crc kubenswrapper[4753]: I1005 21:10:51.899728 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de29aa2f-3662-4b9b-a32d-f6e0da626cb6" path="/var/lib/kubelet/pods/de29aa2f-3662-4b9b-a32d-f6e0da626cb6/volumes" Oct 05 21:10:51 crc kubenswrapper[4753]: I1005 21:10:51.900947 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e054bfac-2418-40f6-9744-3e490a616e70" 
path="/var/lib/kubelet/pods/e054bfac-2418-40f6-9744-3e490a616e70/volumes" Oct 05 21:10:52 crc kubenswrapper[4753]: I1005 21:10:52.528631 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a","Type":"ContainerStarted","Data":"820b0df16ae97606641e131197e54e105e794b4c3d176ff9974078359d3d06f8"} Oct 05 21:10:52 crc kubenswrapper[4753]: I1005 21:10:52.530121 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68a4d310-a272-4033-af72-dfc6e8c239f6","Type":"ContainerStarted","Data":"715e14e73b00000fdd345a9d79e6d92f3659e9433e3b9037e7e95dabc5b24871"} Oct 05 21:10:53 crc kubenswrapper[4753]: I1005 21:10:53.552036 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a","Type":"ContainerStarted","Data":"ada9ec9d81145d284c11e4634f4c9a7dd1f3d0497e640f76594d93577facbfb3"} Oct 05 21:10:53 crc kubenswrapper[4753]: I1005 21:10:53.575449 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68a4d310-a272-4033-af72-dfc6e8c239f6","Type":"ContainerStarted","Data":"ab1744c7705e0836ccee1d4c4cc643aba77add0a47b08a4c45b06009f5d92d71"} Oct 05 21:10:53 crc kubenswrapper[4753]: I1005 21:10:53.580778 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.58076236 podStartE2EDuration="3.58076236s" podCreationTimestamp="2025-10-05 21:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 21:10:53.579719808 +0000 UTC m=+3362.428048040" watchObservedRunningTime="2025-10-05 21:10:53.58076236 +0000 UTC m=+3362.429090582" Oct 05 21:10:54 crc kubenswrapper[4753]: I1005 21:10:54.585539 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"68a4d310-a272-4033-af72-dfc6e8c239f6","Type":"ContainerStarted","Data":"bfb020831a76b9cdd1a00098ef1cc4f5ebd2765121c80415ceb21999a633b201"} Oct 05 21:10:54 crc kubenswrapper[4753]: I1005 21:10:54.906814 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 05 21:10:55 crc kubenswrapper[4753]: I1005 21:10:55.315784 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-55b7f7b494-jrzrq" podUID="ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.244:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.244:8443: connect: connection refused" Oct 05 21:10:55 crc kubenswrapper[4753]: I1005 21:10:55.316268 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:10:55 crc kubenswrapper[4753]: I1005 21:10:55.595486 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68a4d310-a272-4033-af72-dfc6e8c239f6","Type":"ContainerStarted","Data":"26c6c1d95ad2c5d5c628b3cbeb340b2b64264cd9bdd7fb868e83dae168532c2f"} Oct 05 21:10:55 crc kubenswrapper[4753]: I1005 21:10:55.595703 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 05 21:10:55 crc kubenswrapper[4753]: I1005 21:10:55.618779 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.053629274 podStartE2EDuration="5.618763684s" podCreationTimestamp="2025-10-05 21:10:50 +0000 UTC" firstStartedPulling="2025-10-05 21:10:51.449650893 +0000 UTC m=+3360.297979125" lastFinishedPulling="2025-10-05 21:10:55.014785303 +0000 UTC m=+3363.863113535" observedRunningTime="2025-10-05 21:10:55.616459413 +0000 UTC m=+3364.464787665" watchObservedRunningTime="2025-10-05 21:10:55.618763684 +0000 UTC m=+3364.467091916" Oct 05 21:10:57 crc kubenswrapper[4753]: I1005 
21:10:57.320417 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 05 21:10:57 crc kubenswrapper[4753]: I1005 21:10:57.380039 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 05 21:10:57 crc kubenswrapper[4753]: I1005 21:10:57.609904 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="07d7a9f9-7e60-4c3f-a87b-480299693d92" containerName="manila-share" containerID="cri-o://0d911b9d5c07459b55760e6dad72371caa1698a5e45a7485365b02ad5a5779a0" gracePeriod=30 Oct 05 21:10:57 crc kubenswrapper[4753]: I1005 21:10:57.609974 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="07d7a9f9-7e60-4c3f-a87b-480299693d92" containerName="probe" containerID="cri-o://b1499066a2f6f14c1b9e94baad00c8192ff51c0e7adf21bd41431fdc24ebd7f6" gracePeriod=30 Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.440968 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mmw2v"] Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.452475 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.479043 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mmw2v"] Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.537539 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3836426-5f36-4383-a406-a2dafcf7d424-utilities\") pod \"community-operators-mmw2v\" (UID: \"d3836426-5f36-4383-a406-a2dafcf7d424\") " pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.537944 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6wf2\" (UniqueName: \"kubernetes.io/projected/d3836426-5f36-4383-a406-a2dafcf7d424-kube-api-access-w6wf2\") pod \"community-operators-mmw2v\" (UID: \"d3836426-5f36-4383-a406-a2dafcf7d424\") " pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.538065 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3836426-5f36-4383-a406-a2dafcf7d424-catalog-content\") pod \"community-operators-mmw2v\" (UID: \"d3836426-5f36-4383-a406-a2dafcf7d424\") " pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.633308 4753 generic.go:334] "Generic (PLEG): container finished" podID="07d7a9f9-7e60-4c3f-a87b-480299693d92" containerID="b1499066a2f6f14c1b9e94baad00c8192ff51c0e7adf21bd41431fdc24ebd7f6" exitCode=0 Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.633345 4753 generic.go:334] "Generic (PLEG): container finished" podID="07d7a9f9-7e60-4c3f-a87b-480299693d92" 
containerID="0d911b9d5c07459b55760e6dad72371caa1698a5e45a7485365b02ad5a5779a0" exitCode=1 Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.633368 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"07d7a9f9-7e60-4c3f-a87b-480299693d92","Type":"ContainerDied","Data":"b1499066a2f6f14c1b9e94baad00c8192ff51c0e7adf21bd41431fdc24ebd7f6"} Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.633396 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"07d7a9f9-7e60-4c3f-a87b-480299693d92","Type":"ContainerDied","Data":"0d911b9d5c07459b55760e6dad72371caa1698a5e45a7485365b02ad5a5779a0"} Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.633408 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"07d7a9f9-7e60-4c3f-a87b-480299693d92","Type":"ContainerDied","Data":"e5d4bb7abe653f30a420905fc8e26b2976301848a6733d7d0bf82a7e1d5949a8"} Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.633419 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5d4bb7abe653f30a420905fc8e26b2976301848a6733d7d0bf82a7e1d5949a8" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.639639 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3836426-5f36-4383-a406-a2dafcf7d424-utilities\") pod \"community-operators-mmw2v\" (UID: \"d3836426-5f36-4383-a406-a2dafcf7d424\") " pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.639725 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6wf2\" (UniqueName: \"kubernetes.io/projected/d3836426-5f36-4383-a406-a2dafcf7d424-kube-api-access-w6wf2\") pod \"community-operators-mmw2v\" (UID: \"d3836426-5f36-4383-a406-a2dafcf7d424\") " 
pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.639773 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3836426-5f36-4383-a406-a2dafcf7d424-catalog-content\") pod \"community-operators-mmw2v\" (UID: \"d3836426-5f36-4383-a406-a2dafcf7d424\") " pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.640359 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3836426-5f36-4383-a406-a2dafcf7d424-catalog-content\") pod \"community-operators-mmw2v\" (UID: \"d3836426-5f36-4383-a406-a2dafcf7d424\") " pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.640493 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3836426-5f36-4383-a406-a2dafcf7d424-utilities\") pod \"community-operators-mmw2v\" (UID: \"d3836426-5f36-4383-a406-a2dafcf7d424\") " pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.642684 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.672275 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6wf2\" (UniqueName: \"kubernetes.io/projected/d3836426-5f36-4383-a406-a2dafcf7d424-kube-api-access-w6wf2\") pod \"community-operators-mmw2v\" (UID: \"d3836426-5f36-4383-a406-a2dafcf7d424\") " pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.740760 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-scripts\") pod \"07d7a9f9-7e60-4c3f-a87b-480299693d92\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.740804 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/07d7a9f9-7e60-4c3f-a87b-480299693d92-ceph\") pod \"07d7a9f9-7e60-4c3f-a87b-480299693d92\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.740833 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/07d7a9f9-7e60-4c3f-a87b-480299693d92-var-lib-manila\") pod \"07d7a9f9-7e60-4c3f-a87b-480299693d92\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.740890 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-config-data-custom\") pod \"07d7a9f9-7e60-4c3f-a87b-480299693d92\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.740906 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-config-data\") pod \"07d7a9f9-7e60-4c3f-a87b-480299693d92\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.740941 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-combined-ca-bundle\") pod \"07d7a9f9-7e60-4c3f-a87b-480299693d92\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.740965 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26sr4\" (UniqueName: \"kubernetes.io/projected/07d7a9f9-7e60-4c3f-a87b-480299693d92-kube-api-access-26sr4\") pod \"07d7a9f9-7e60-4c3f-a87b-480299693d92\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.741087 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07d7a9f9-7e60-4c3f-a87b-480299693d92-etc-machine-id\") pod \"07d7a9f9-7e60-4c3f-a87b-480299693d92\" (UID: \"07d7a9f9-7e60-4c3f-a87b-480299693d92\") " Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.741510 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07d7a9f9-7e60-4c3f-a87b-480299693d92-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "07d7a9f9-7e60-4c3f-a87b-480299693d92" (UID: "07d7a9f9-7e60-4c3f-a87b-480299693d92"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.742152 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07d7a9f9-7e60-4c3f-a87b-480299693d92-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "07d7a9f9-7e60-4c3f-a87b-480299693d92" (UID: "07d7a9f9-7e60-4c3f-a87b-480299693d92"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.747660 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "07d7a9f9-7e60-4c3f-a87b-480299693d92" (UID: "07d7a9f9-7e60-4c3f-a87b-480299693d92"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.754606 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-scripts" (OuterVolumeSpecName: "scripts") pod "07d7a9f9-7e60-4c3f-a87b-480299693d92" (UID: "07d7a9f9-7e60-4c3f-a87b-480299693d92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.754687 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d7a9f9-7e60-4c3f-a87b-480299693d92-kube-api-access-26sr4" (OuterVolumeSpecName: "kube-api-access-26sr4") pod "07d7a9f9-7e60-4c3f-a87b-480299693d92" (UID: "07d7a9f9-7e60-4c3f-a87b-480299693d92"). InnerVolumeSpecName "kube-api-access-26sr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.754740 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d7a9f9-7e60-4c3f-a87b-480299693d92-ceph" (OuterVolumeSpecName: "ceph") pod "07d7a9f9-7e60-4c3f-a87b-480299693d92" (UID: "07d7a9f9-7e60-4c3f-a87b-480299693d92"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.801251 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.834286 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07d7a9f9-7e60-4c3f-a87b-480299693d92" (UID: "07d7a9f9-7e60-4c3f-a87b-480299693d92"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.847537 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26sr4\" (UniqueName: \"kubernetes.io/projected/07d7a9f9-7e60-4c3f-a87b-480299693d92-kube-api-access-26sr4\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.847572 4753 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07d7a9f9-7e60-4c3f-a87b-480299693d92-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.847584 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.847594 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/07d7a9f9-7e60-4c3f-a87b-480299693d92-ceph\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.847604 4753 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/07d7a9f9-7e60-4c3f-a87b-480299693d92-var-lib-manila\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.847617 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.847627 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.887560 4753 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-config-data" (OuterVolumeSpecName: "config-data") pod "07d7a9f9-7e60-4c3f-a87b-480299693d92" (UID: "07d7a9f9-7e60-4c3f-a87b-480299693d92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:10:58 crc kubenswrapper[4753]: I1005 21:10:58.949119 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d7a9f9-7e60-4c3f-a87b-480299693d92-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.457480 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mmw2v"] Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.663693 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.667869 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmw2v" event={"ID":"d3836426-5f36-4383-a406-a2dafcf7d424","Type":"ContainerStarted","Data":"f9145ad139a84b16cc5ade4612353badd4ac2ff61723d040337636194422c490"} Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.718523 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.727385 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.747195 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 05 21:10:59 crc kubenswrapper[4753]: E1005 21:10:59.747655 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d7a9f9-7e60-4c3f-a87b-480299693d92" containerName="manila-share" Oct 05 21:10:59 crc 
kubenswrapper[4753]: I1005 21:10:59.747670 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d7a9f9-7e60-4c3f-a87b-480299693d92" containerName="manila-share" Oct 05 21:10:59 crc kubenswrapper[4753]: E1005 21:10:59.747679 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d7a9f9-7e60-4c3f-a87b-480299693d92" containerName="probe" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.747685 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d7a9f9-7e60-4c3f-a87b-480299693d92" containerName="probe" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.747924 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d7a9f9-7e60-4c3f-a87b-480299693d92" containerName="probe" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.747943 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d7a9f9-7e60-4c3f-a87b-480299693d92" containerName="manila-share" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.748969 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.752615 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.755507 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.761823 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d8dd124d-011e-41dd-813b-b16ad8039461-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.761886 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8dd124d-011e-41dd-813b-b16ad8039461-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.761923 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8dd124d-011e-41dd-813b-b16ad8039461-config-data\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.761960 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnkwb\" (UniqueName: \"kubernetes.io/projected/d8dd124d-011e-41dd-813b-b16ad8039461-kube-api-access-tnkwb\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 
crc kubenswrapper[4753]: I1005 21:10:59.761979 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8dd124d-011e-41dd-813b-b16ad8039461-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.762019 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8dd124d-011e-41dd-813b-b16ad8039461-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.762056 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d8dd124d-011e-41dd-813b-b16ad8039461-ceph\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.762073 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8dd124d-011e-41dd-813b-b16ad8039461-scripts\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.863731 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d8dd124d-011e-41dd-813b-b16ad8039461-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.863849 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d8dd124d-011e-41dd-813b-b16ad8039461-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.864090 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8dd124d-011e-41dd-813b-b16ad8039461-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.865248 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d7a9f9-7e60-4c3f-a87b-480299693d92" path="/var/lib/kubelet/pods/07d7a9f9-7e60-4c3f-a87b-480299693d92/volumes" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.866493 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8dd124d-011e-41dd-813b-b16ad8039461-config-data\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.866599 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnkwb\" (UniqueName: \"kubernetes.io/projected/d8dd124d-011e-41dd-813b-b16ad8039461-kube-api-access-tnkwb\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.866635 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8dd124d-011e-41dd-813b-b16ad8039461-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " 
pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.866729 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8dd124d-011e-41dd-813b-b16ad8039461-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.866822 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d8dd124d-011e-41dd-813b-b16ad8039461-ceph\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.866843 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8dd124d-011e-41dd-813b-b16ad8039461-scripts\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.868775 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d8dd124d-011e-41dd-813b-b16ad8039461-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.869411 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8dd124d-011e-41dd-813b-b16ad8039461-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.872931 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8dd124d-011e-41dd-813b-b16ad8039461-config-data\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.873338 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8dd124d-011e-41dd-813b-b16ad8039461-scripts\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.874133 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8dd124d-011e-41dd-813b-b16ad8039461-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.874840 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d8dd124d-011e-41dd-813b-b16ad8039461-ceph\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:10:59 crc kubenswrapper[4753]: I1005 21:10:59.889645 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnkwb\" (UniqueName: \"kubernetes.io/projected/d8dd124d-011e-41dd-813b-b16ad8039461-kube-api-access-tnkwb\") pod \"manila-share-share1-0\" (UID: \"d8dd124d-011e-41dd-813b-b16ad8039461\") " pod="openstack/manila-share-share1-0" Oct 05 21:11:00 crc kubenswrapper[4753]: I1005 21:11:00.075913 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 05 21:11:00 crc kubenswrapper[4753]: I1005 21:11:00.660548 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 05 21:11:00 crc kubenswrapper[4753]: I1005 21:11:00.677401 4753 generic.go:334] "Generic (PLEG): container finished" podID="d3836426-5f36-4383-a406-a2dafcf7d424" containerID="77024ee122f493f29050b76182e05fad1a3dc64e33ad1f3fb26978a81922aca3" exitCode=0 Oct 05 21:11:00 crc kubenswrapper[4753]: I1005 21:11:00.677437 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmw2v" event={"ID":"d3836426-5f36-4383-a406-a2dafcf7d424","Type":"ContainerDied","Data":"77024ee122f493f29050b76182e05fad1a3dc64e33ad1f3fb26978a81922aca3"} Oct 05 21:11:00 crc kubenswrapper[4753]: I1005 21:11:00.996071 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 05 21:11:01 crc kubenswrapper[4753]: I1005 21:11:01.686819 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d8dd124d-011e-41dd-813b-b16ad8039461","Type":"ContainerStarted","Data":"51874b54b766e1e1088b67592f3b1c0fb0a2bde4ceba4edfd563d2d9e24403c1"} Oct 05 21:11:01 crc kubenswrapper[4753]: I1005 21:11:01.687300 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d8dd124d-011e-41dd-813b-b16ad8039461","Type":"ContainerStarted","Data":"9d6ad589cd0b34db75a03915893f66396968ad211a755d03f13696f72174dbd6"} Oct 05 21:11:02 crc kubenswrapper[4753]: I1005 21:11:02.696965 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d8dd124d-011e-41dd-813b-b16ad8039461","Type":"ContainerStarted","Data":"3b53cc72a851700f4d0d975096e5d5ae989998ee1f88f322635ca045e5a5d680"} Oct 05 21:11:02 crc kubenswrapper[4753]: I1005 21:11:02.700114 4753 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-mmw2v" event={"ID":"d3836426-5f36-4383-a406-a2dafcf7d424","Type":"ContainerStarted","Data":"dd1ee084fd3fb90f65623793857b562e0fc406156bd16f7790e1d102fa3f3726"} Oct 05 21:11:02 crc kubenswrapper[4753]: I1005 21:11:02.741269 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.7412457630000002 podStartE2EDuration="3.741245763s" podCreationTimestamp="2025-10-05 21:10:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 21:11:02.718562725 +0000 UTC m=+3371.566890957" watchObservedRunningTime="2025-10-05 21:11:02.741245763 +0000 UTC m=+3371.589573995" Oct 05 21:11:03 crc kubenswrapper[4753]: I1005 21:11:03.712442 4753 generic.go:334] "Generic (PLEG): container finished" podID="ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" containerID="2fd3ed10b25399b5b6080325002d9ee0926b4851ba516017af7dd927b562f21b" exitCode=137 Oct 05 21:11:03 crc kubenswrapper[4753]: I1005 21:11:03.712642 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55b7f7b494-jrzrq" event={"ID":"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac","Type":"ContainerDied","Data":"2fd3ed10b25399b5b6080325002d9ee0926b4851ba516017af7dd927b562f21b"} Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.279132 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.368985 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-horizon-tls-certs\") pod \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.369106 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-config-data\") pod \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.370069 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-logs\") pod \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.370331 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-scripts\") pod \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.370413 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7qjv\" (UniqueName: \"kubernetes.io/projected/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-kube-api-access-n7qjv\") pod \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.370523 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-horizon-secret-key\") pod \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.370590 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-combined-ca-bundle\") pod \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\" (UID: \"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac\") " Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.371027 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-logs" (OuterVolumeSpecName: "logs") pod "ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" (UID: "ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.371527 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-logs\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.376307 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-kube-api-access-n7qjv" (OuterVolumeSpecName: "kube-api-access-n7qjv") pod "ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" (UID: "ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac"). InnerVolumeSpecName "kube-api-access-n7qjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.389547 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" (UID: "ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.400799 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" (UID: "ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.407806 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-config-data" (OuterVolumeSpecName: "config-data") pod "ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" (UID: "ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.437606 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-scripts" (OuterVolumeSpecName: "scripts") pod "ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" (UID: "ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.438110 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" (UID: "ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.473583 4753 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.473619 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.473629 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-scripts\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.473638 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7qjv\" (UniqueName: \"kubernetes.io/projected/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-kube-api-access-n7qjv\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.473650 4753 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.473659 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.725162 4753 generic.go:334] "Generic (PLEG): container finished" podID="d3836426-5f36-4383-a406-a2dafcf7d424" containerID="dd1ee084fd3fb90f65623793857b562e0fc406156bd16f7790e1d102fa3f3726" exitCode=0 Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.725257 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmw2v" event={"ID":"d3836426-5f36-4383-a406-a2dafcf7d424","Type":"ContainerDied","Data":"dd1ee084fd3fb90f65623793857b562e0fc406156bd16f7790e1d102fa3f3726"} Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.727761 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55b7f7b494-jrzrq" event={"ID":"ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac","Type":"ContainerDied","Data":"0e3d8faf8c3de3067cce948a0afcac160c6dad86f73786c528635e84403b916d"} Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.727808 4753 scope.go:117] "RemoveContainer" containerID="bd538def4f88bd5b4cb375fe1ce651aa53f18a5f9a12c0076341cfb4f176b206" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.727954 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55b7f7b494-jrzrq" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.809365 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55b7f7b494-jrzrq"] Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.824599 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-55b7f7b494-jrzrq"] Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.839469 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lnmzx"] Oct 05 21:11:04 crc kubenswrapper[4753]: E1005 21:11:04.840020 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" containerName="horizon-log" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.840038 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" containerName="horizon-log" Oct 05 21:11:04 crc kubenswrapper[4753]: E1005 21:11:04.840089 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" containerName="horizon" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.840097 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" containerName="horizon" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.840331 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" containerName="horizon" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.840360 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" containerName="horizon-log" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.842094 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.857479 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnmzx"] Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.897398 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlzm4\" (UniqueName: \"kubernetes.io/projected/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-kube-api-access-qlzm4\") pod \"redhat-marketplace-lnmzx\" (UID: \"4af6bc21-d912-431d-aeae-ddf8eaaf1c86\") " pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.897630 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-utilities\") pod \"redhat-marketplace-lnmzx\" (UID: \"4af6bc21-d912-431d-aeae-ddf8eaaf1c86\") " pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.897728 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-catalog-content\") pod \"redhat-marketplace-lnmzx\" (UID: \"4af6bc21-d912-431d-aeae-ddf8eaaf1c86\") " pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.999402 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlzm4\" (UniqueName: \"kubernetes.io/projected/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-kube-api-access-qlzm4\") pod \"redhat-marketplace-lnmzx\" (UID: \"4af6bc21-d912-431d-aeae-ddf8eaaf1c86\") " pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.999570 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-utilities\") pod \"redhat-marketplace-lnmzx\" (UID: \"4af6bc21-d912-431d-aeae-ddf8eaaf1c86\") " pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.999602 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-catalog-content\") pod \"redhat-marketplace-lnmzx\" (UID: \"4af6bc21-d912-431d-aeae-ddf8eaaf1c86\") " pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:04 crc kubenswrapper[4753]: I1005 21:11:04.999973 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-catalog-content\") pod \"redhat-marketplace-lnmzx\" (UID: \"4af6bc21-d912-431d-aeae-ddf8eaaf1c86\") " pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:05 crc kubenswrapper[4753]: I1005 21:11:05.000608 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-utilities\") pod \"redhat-marketplace-lnmzx\" (UID: \"4af6bc21-d912-431d-aeae-ddf8eaaf1c86\") " pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:05 crc kubenswrapper[4753]: I1005 21:11:05.007325 4753 scope.go:117] "RemoveContainer" containerID="2fd3ed10b25399b5b6080325002d9ee0926b4851ba516017af7dd927b562f21b" Oct 05 21:11:05 crc kubenswrapper[4753]: I1005 21:11:05.020745 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlzm4\" (UniqueName: \"kubernetes.io/projected/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-kube-api-access-qlzm4\") pod \"redhat-marketplace-lnmzx\" (UID: \"4af6bc21-d912-431d-aeae-ddf8eaaf1c86\") " 
pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:05 crc kubenswrapper[4753]: I1005 21:11:05.197555 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:05 crc kubenswrapper[4753]: I1005 21:11:05.719120 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnmzx"] Oct 05 21:11:05 crc kubenswrapper[4753]: I1005 21:11:05.755589 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmw2v" event={"ID":"d3836426-5f36-4383-a406-a2dafcf7d424","Type":"ContainerStarted","Data":"b0c076480cc88dad2e8ec87e94bfde5f548d032f7ac777c95002eca5bf98638f"} Oct 05 21:11:05 crc kubenswrapper[4753]: I1005 21:11:05.866287 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac" path="/var/lib/kubelet/pods/ec31fcd6-85bb-4fbf-8cd0-8e29d1de0fac/volumes" Oct 05 21:11:06 crc kubenswrapper[4753]: I1005 21:11:06.775061 4753 generic.go:334] "Generic (PLEG): container finished" podID="4af6bc21-d912-431d-aeae-ddf8eaaf1c86" containerID="3a7dccc9d207694ee35efa01422bdeda48241f2bbcfc33cd524cc71081be947d" exitCode=0 Oct 05 21:11:06 crc kubenswrapper[4753]: I1005 21:11:06.775172 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnmzx" event={"ID":"4af6bc21-d912-431d-aeae-ddf8eaaf1c86","Type":"ContainerDied","Data":"3a7dccc9d207694ee35efa01422bdeda48241f2bbcfc33cd524cc71081be947d"} Oct 05 21:11:06 crc kubenswrapper[4753]: I1005 21:11:06.775460 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnmzx" event={"ID":"4af6bc21-d912-431d-aeae-ddf8eaaf1c86","Type":"ContainerStarted","Data":"b701e7c0c7303f90ce423f237a2debe3ab4b4858de5613646c80e4bdb1f603c9"} Oct 05 21:11:06 crc kubenswrapper[4753]: I1005 21:11:06.804506 4753 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/community-operators-mmw2v" podStartSLOduration=4.138981223 podStartE2EDuration="8.80448505s" podCreationTimestamp="2025-10-05 21:10:58 +0000 UTC" firstStartedPulling="2025-10-05 21:11:00.68997642 +0000 UTC m=+3369.538304652" lastFinishedPulling="2025-10-05 21:11:05.355480247 +0000 UTC m=+3374.203808479" observedRunningTime="2025-10-05 21:11:05.781736442 +0000 UTC m=+3374.630064674" watchObservedRunningTime="2025-10-05 21:11:06.80448505 +0000 UTC m=+3375.652813282" Oct 05 21:11:07 crc kubenswrapper[4753]: I1005 21:11:07.199405 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vc9ss"] Oct 05 21:11:07 crc kubenswrapper[4753]: I1005 21:11:07.206469 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:07 crc kubenswrapper[4753]: I1005 21:11:07.221713 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vc9ss"] Oct 05 21:11:07 crc kubenswrapper[4753]: I1005 21:11:07.247670 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrm58\" (UniqueName: \"kubernetes.io/projected/c45c313b-5228-4055-826f-bd89b3ef23b8-kube-api-access-nrm58\") pod \"redhat-operators-vc9ss\" (UID: \"c45c313b-5228-4055-826f-bd89b3ef23b8\") " pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:07 crc kubenswrapper[4753]: I1005 21:11:07.247815 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c45c313b-5228-4055-826f-bd89b3ef23b8-utilities\") pod \"redhat-operators-vc9ss\" (UID: \"c45c313b-5228-4055-826f-bd89b3ef23b8\") " pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:07 crc kubenswrapper[4753]: I1005 21:11:07.247912 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c45c313b-5228-4055-826f-bd89b3ef23b8-catalog-content\") pod \"redhat-operators-vc9ss\" (UID: \"c45c313b-5228-4055-826f-bd89b3ef23b8\") " pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:07 crc kubenswrapper[4753]: I1005 21:11:07.349946 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c45c313b-5228-4055-826f-bd89b3ef23b8-catalog-content\") pod \"redhat-operators-vc9ss\" (UID: \"c45c313b-5228-4055-826f-bd89b3ef23b8\") " pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:07 crc kubenswrapper[4753]: I1005 21:11:07.350066 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrm58\" (UniqueName: \"kubernetes.io/projected/c45c313b-5228-4055-826f-bd89b3ef23b8-kube-api-access-nrm58\") pod \"redhat-operators-vc9ss\" (UID: \"c45c313b-5228-4055-826f-bd89b3ef23b8\") " pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:07 crc kubenswrapper[4753]: I1005 21:11:07.350125 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c45c313b-5228-4055-826f-bd89b3ef23b8-utilities\") pod \"redhat-operators-vc9ss\" (UID: \"c45c313b-5228-4055-826f-bd89b3ef23b8\") " pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:07 crc kubenswrapper[4753]: I1005 21:11:07.350624 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c45c313b-5228-4055-826f-bd89b3ef23b8-utilities\") pod \"redhat-operators-vc9ss\" (UID: \"c45c313b-5228-4055-826f-bd89b3ef23b8\") " pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:07 crc kubenswrapper[4753]: I1005 21:11:07.350842 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c45c313b-5228-4055-826f-bd89b3ef23b8-catalog-content\") pod \"redhat-operators-vc9ss\" (UID: \"c45c313b-5228-4055-826f-bd89b3ef23b8\") " pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:07 crc kubenswrapper[4753]: I1005 21:11:07.369773 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrm58\" (UniqueName: \"kubernetes.io/projected/c45c313b-5228-4055-826f-bd89b3ef23b8-kube-api-access-nrm58\") pod \"redhat-operators-vc9ss\" (UID: \"c45c313b-5228-4055-826f-bd89b3ef23b8\") " pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:07 crc kubenswrapper[4753]: I1005 21:11:07.522443 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:07 crc kubenswrapper[4753]: I1005 21:11:07.797258 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnmzx" event={"ID":"4af6bc21-d912-431d-aeae-ddf8eaaf1c86","Type":"ContainerStarted","Data":"6dba71f4ca5390043ef19cd350f18b36cd283b0b9da33f2ad5dd4ef716bef822"} Oct 05 21:11:08 crc kubenswrapper[4753]: I1005 21:11:08.026296 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vc9ss"] Oct 05 21:11:08 crc kubenswrapper[4753]: I1005 21:11:08.802339 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:11:08 crc kubenswrapper[4753]: I1005 21:11:08.802725 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:11:08 crc kubenswrapper[4753]: I1005 21:11:08.818514 4753 generic.go:334] "Generic (PLEG): container finished" podID="4af6bc21-d912-431d-aeae-ddf8eaaf1c86" containerID="6dba71f4ca5390043ef19cd350f18b36cd283b0b9da33f2ad5dd4ef716bef822" exitCode=0 Oct 05 21:11:08 crc 
kubenswrapper[4753]: I1005 21:11:08.818582 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnmzx" event={"ID":"4af6bc21-d912-431d-aeae-ddf8eaaf1c86","Type":"ContainerDied","Data":"6dba71f4ca5390043ef19cd350f18b36cd283b0b9da33f2ad5dd4ef716bef822"} Oct 05 21:11:08 crc kubenswrapper[4753]: I1005 21:11:08.827295 4753 generic.go:334] "Generic (PLEG): container finished" podID="c45c313b-5228-4055-826f-bd89b3ef23b8" containerID="adbbddc9eeeed625685d9c6aebad7b58fccb009cfe21b9ddbff83e8e2c46f5f4" exitCode=0 Oct 05 21:11:08 crc kubenswrapper[4753]: I1005 21:11:08.827341 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vc9ss" event={"ID":"c45c313b-5228-4055-826f-bd89b3ef23b8","Type":"ContainerDied","Data":"adbbddc9eeeed625685d9c6aebad7b58fccb009cfe21b9ddbff83e8e2c46f5f4"} Oct 05 21:11:08 crc kubenswrapper[4753]: I1005 21:11:08.827463 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vc9ss" event={"ID":"c45c313b-5228-4055-826f-bd89b3ef23b8","Type":"ContainerStarted","Data":"72d75f614102b15776a504bdc435db5d656bab8678dcbb8f2fa232bd3f11b642"} Oct 05 21:11:09 crc kubenswrapper[4753]: I1005 21:11:09.838710 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnmzx" event={"ID":"4af6bc21-d912-431d-aeae-ddf8eaaf1c86","Type":"ContainerStarted","Data":"beb199e132a3c45e37113344a12825b4c24220de995d0f730e3772aabfb0866f"} Oct 05 21:11:09 crc kubenswrapper[4753]: I1005 21:11:09.867672 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lnmzx" podStartSLOduration=3.413461115 podStartE2EDuration="5.867657896s" podCreationTimestamp="2025-10-05 21:11:04 +0000 UTC" firstStartedPulling="2025-10-05 21:11:06.777933642 +0000 UTC m=+3375.626261884" lastFinishedPulling="2025-10-05 21:11:09.232130433 +0000 UTC m=+3378.080458665" 
observedRunningTime="2025-10-05 21:11:09.861418944 +0000 UTC m=+3378.709747176" watchObservedRunningTime="2025-10-05 21:11:09.867657896 +0000 UTC m=+3378.715986128" Oct 05 21:11:09 crc kubenswrapper[4753]: I1005 21:11:09.870794 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-mmw2v" podUID="d3836426-5f36-4383-a406-a2dafcf7d424" containerName="registry-server" probeResult="failure" output=< Oct 05 21:11:09 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 21:11:09 crc kubenswrapper[4753]: > Oct 05 21:11:10 crc kubenswrapper[4753]: I1005 21:11:10.077431 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 05 21:11:10 crc kubenswrapper[4753]: I1005 21:11:10.858397 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vc9ss" event={"ID":"c45c313b-5228-4055-826f-bd89b3ef23b8","Type":"ContainerStarted","Data":"9b49c8090408d9e333888c3576744b4df5daf91bc3beac4f616c55b8cf15bf41"} Oct 05 21:11:12 crc kubenswrapper[4753]: I1005 21:11:12.809478 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 05 21:11:14 crc kubenswrapper[4753]: I1005 21:11:14.931308 4753 generic.go:334] "Generic (PLEG): container finished" podID="c45c313b-5228-4055-826f-bd89b3ef23b8" containerID="9b49c8090408d9e333888c3576744b4df5daf91bc3beac4f616c55b8cf15bf41" exitCode=0 Oct 05 21:11:14 crc kubenswrapper[4753]: I1005 21:11:14.931422 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vc9ss" event={"ID":"c45c313b-5228-4055-826f-bd89b3ef23b8","Type":"ContainerDied","Data":"9b49c8090408d9e333888c3576744b4df5daf91bc3beac4f616c55b8cf15bf41"} Oct 05 21:11:15 crc kubenswrapper[4753]: I1005 21:11:15.199321 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:15 crc kubenswrapper[4753]: I1005 21:11:15.199358 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:15 crc kubenswrapper[4753]: I1005 21:11:15.255026 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:15 crc kubenswrapper[4753]: I1005 21:11:15.950536 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vc9ss" event={"ID":"c45c313b-5228-4055-826f-bd89b3ef23b8","Type":"ContainerStarted","Data":"00bb0be23cb47229d818a6a2653ee1a6b3db1dc0b7e3658da5fe9fc5b9322ec6"} Oct 05 21:11:15 crc kubenswrapper[4753]: I1005 21:11:15.971726 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vc9ss" podStartSLOduration=2.271501244 podStartE2EDuration="8.971706355s" podCreationTimestamp="2025-10-05 21:11:07 +0000 UTC" firstStartedPulling="2025-10-05 21:11:08.829855915 +0000 UTC m=+3377.678184147" lastFinishedPulling="2025-10-05 21:11:15.530061026 +0000 UTC m=+3384.378389258" observedRunningTime="2025-10-05 21:11:15.968496576 +0000 UTC m=+3384.816824808" watchObservedRunningTime="2025-10-05 21:11:15.971706355 +0000 UTC m=+3384.820034587" Oct 05 21:11:15 crc kubenswrapper[4753]: I1005 21:11:15.997724 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:17 crc kubenswrapper[4753]: I1005 21:11:17.202893 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnmzx"] Oct 05 21:11:17 crc kubenswrapper[4753]: I1005 21:11:17.527539 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:17 crc kubenswrapper[4753]: I1005 21:11:17.527588 
4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:17 crc kubenswrapper[4753]: I1005 21:11:17.965485 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lnmzx" podUID="4af6bc21-d912-431d-aeae-ddf8eaaf1c86" containerName="registry-server" containerID="cri-o://beb199e132a3c45e37113344a12825b4c24220de995d0f730e3772aabfb0866f" gracePeriod=2 Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.475636 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.557351 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlzm4\" (UniqueName: \"kubernetes.io/projected/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-kube-api-access-qlzm4\") pod \"4af6bc21-d912-431d-aeae-ddf8eaaf1c86\" (UID: \"4af6bc21-d912-431d-aeae-ddf8eaaf1c86\") " Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.557403 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-catalog-content\") pod \"4af6bc21-d912-431d-aeae-ddf8eaaf1c86\" (UID: \"4af6bc21-d912-431d-aeae-ddf8eaaf1c86\") " Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.557474 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-utilities\") pod \"4af6bc21-d912-431d-aeae-ddf8eaaf1c86\" (UID: \"4af6bc21-d912-431d-aeae-ddf8eaaf1c86\") " Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.560932 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-utilities" (OuterVolumeSpecName: 
"utilities") pod "4af6bc21-d912-431d-aeae-ddf8eaaf1c86" (UID: "4af6bc21-d912-431d-aeae-ddf8eaaf1c86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.565521 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-kube-api-access-qlzm4" (OuterVolumeSpecName: "kube-api-access-qlzm4") pod "4af6bc21-d912-431d-aeae-ddf8eaaf1c86" (UID: "4af6bc21-d912-431d-aeae-ddf8eaaf1c86"). InnerVolumeSpecName "kube-api-access-qlzm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.577215 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4af6bc21-d912-431d-aeae-ddf8eaaf1c86" (UID: "4af6bc21-d912-431d-aeae-ddf8eaaf1c86"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.585316 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vc9ss" podUID="c45c313b-5228-4055-826f-bd89b3ef23b8" containerName="registry-server" probeResult="failure" output=< Oct 05 21:11:18 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 21:11:18 crc kubenswrapper[4753]: > Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.659814 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.659869 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlzm4\" (UniqueName: \"kubernetes.io/projected/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-kube-api-access-qlzm4\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.659892 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4af6bc21-d912-431d-aeae-ddf8eaaf1c86-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.848834 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.899975 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.976996 4753 generic.go:334] "Generic (PLEG): container finished" podID="4af6bc21-d912-431d-aeae-ddf8eaaf1c86" containerID="beb199e132a3c45e37113344a12825b4c24220de995d0f730e3772aabfb0866f" exitCode=0 Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.977079 4753 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnmzx" Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.977086 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnmzx" event={"ID":"4af6bc21-d912-431d-aeae-ddf8eaaf1c86","Type":"ContainerDied","Data":"beb199e132a3c45e37113344a12825b4c24220de995d0f730e3772aabfb0866f"} Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.977161 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnmzx" event={"ID":"4af6bc21-d912-431d-aeae-ddf8eaaf1c86","Type":"ContainerDied","Data":"b701e7c0c7303f90ce423f237a2debe3ab4b4858de5613646c80e4bdb1f603c9"} Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.977184 4753 scope.go:117] "RemoveContainer" containerID="beb199e132a3c45e37113344a12825b4c24220de995d0f730e3772aabfb0866f" Oct 05 21:11:18 crc kubenswrapper[4753]: I1005 21:11:18.998337 4753 scope.go:117] "RemoveContainer" containerID="6dba71f4ca5390043ef19cd350f18b36cd283b0b9da33f2ad5dd4ef716bef822" Oct 05 21:11:19 crc kubenswrapper[4753]: I1005 21:11:19.022048 4753 scope.go:117] "RemoveContainer" containerID="3a7dccc9d207694ee35efa01422bdeda48241f2bbcfc33cd524cc71081be947d" Oct 05 21:11:19 crc kubenswrapper[4753]: I1005 21:11:19.022195 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnmzx"] Oct 05 21:11:19 crc kubenswrapper[4753]: I1005 21:11:19.024638 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnmzx"] Oct 05 21:11:19 crc kubenswrapper[4753]: I1005 21:11:19.062836 4753 scope.go:117] "RemoveContainer" containerID="beb199e132a3c45e37113344a12825b4c24220de995d0f730e3772aabfb0866f" Oct 05 21:11:19 crc kubenswrapper[4753]: E1005 21:11:19.065267 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"beb199e132a3c45e37113344a12825b4c24220de995d0f730e3772aabfb0866f\": container with ID starting with beb199e132a3c45e37113344a12825b4c24220de995d0f730e3772aabfb0866f not found: ID does not exist" containerID="beb199e132a3c45e37113344a12825b4c24220de995d0f730e3772aabfb0866f" Oct 05 21:11:19 crc kubenswrapper[4753]: I1005 21:11:19.065298 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb199e132a3c45e37113344a12825b4c24220de995d0f730e3772aabfb0866f"} err="failed to get container status \"beb199e132a3c45e37113344a12825b4c24220de995d0f730e3772aabfb0866f\": rpc error: code = NotFound desc = could not find container \"beb199e132a3c45e37113344a12825b4c24220de995d0f730e3772aabfb0866f\": container with ID starting with beb199e132a3c45e37113344a12825b4c24220de995d0f730e3772aabfb0866f not found: ID does not exist" Oct 05 21:11:19 crc kubenswrapper[4753]: I1005 21:11:19.065318 4753 scope.go:117] "RemoveContainer" containerID="6dba71f4ca5390043ef19cd350f18b36cd283b0b9da33f2ad5dd4ef716bef822" Oct 05 21:11:19 crc kubenswrapper[4753]: E1005 21:11:19.065632 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dba71f4ca5390043ef19cd350f18b36cd283b0b9da33f2ad5dd4ef716bef822\": container with ID starting with 6dba71f4ca5390043ef19cd350f18b36cd283b0b9da33f2ad5dd4ef716bef822 not found: ID does not exist" containerID="6dba71f4ca5390043ef19cd350f18b36cd283b0b9da33f2ad5dd4ef716bef822" Oct 05 21:11:19 crc kubenswrapper[4753]: I1005 21:11:19.065743 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dba71f4ca5390043ef19cd350f18b36cd283b0b9da33f2ad5dd4ef716bef822"} err="failed to get container status \"6dba71f4ca5390043ef19cd350f18b36cd283b0b9da33f2ad5dd4ef716bef822\": rpc error: code = NotFound desc = could not find container \"6dba71f4ca5390043ef19cd350f18b36cd283b0b9da33f2ad5dd4ef716bef822\": container 
with ID starting with 6dba71f4ca5390043ef19cd350f18b36cd283b0b9da33f2ad5dd4ef716bef822 not found: ID does not exist" Oct 05 21:11:19 crc kubenswrapper[4753]: I1005 21:11:19.065849 4753 scope.go:117] "RemoveContainer" containerID="3a7dccc9d207694ee35efa01422bdeda48241f2bbcfc33cd524cc71081be947d" Oct 05 21:11:19 crc kubenswrapper[4753]: E1005 21:11:19.066521 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a7dccc9d207694ee35efa01422bdeda48241f2bbcfc33cd524cc71081be947d\": container with ID starting with 3a7dccc9d207694ee35efa01422bdeda48241f2bbcfc33cd524cc71081be947d not found: ID does not exist" containerID="3a7dccc9d207694ee35efa01422bdeda48241f2bbcfc33cd524cc71081be947d" Oct 05 21:11:19 crc kubenswrapper[4753]: I1005 21:11:19.066569 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a7dccc9d207694ee35efa01422bdeda48241f2bbcfc33cd524cc71081be947d"} err="failed to get container status \"3a7dccc9d207694ee35efa01422bdeda48241f2bbcfc33cd524cc71081be947d\": rpc error: code = NotFound desc = could not find container \"3a7dccc9d207694ee35efa01422bdeda48241f2bbcfc33cd524cc71081be947d\": container with ID starting with 3a7dccc9d207694ee35efa01422bdeda48241f2bbcfc33cd524cc71081be947d not found: ID does not exist" Oct 05 21:11:19 crc kubenswrapper[4753]: I1005 21:11:19.861936 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4af6bc21-d912-431d-aeae-ddf8eaaf1c86" path="/var/lib/kubelet/pods/4af6bc21-d912-431d-aeae-ddf8eaaf1c86/volumes" Oct 05 21:11:20 crc kubenswrapper[4753]: I1005 21:11:20.989913 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 05 21:11:21 crc kubenswrapper[4753]: I1005 21:11:21.196626 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mmw2v"] Oct 05 21:11:21 crc kubenswrapper[4753]: I1005 
21:11:21.197210 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mmw2v" podUID="d3836426-5f36-4383-a406-a2dafcf7d424" containerName="registry-server" containerID="cri-o://b0c076480cc88dad2e8ec87e94bfde5f548d032f7ac777c95002eca5bf98638f" gracePeriod=2 Oct 05 21:11:21 crc kubenswrapper[4753]: I1005 21:11:21.634066 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:11:21 crc kubenswrapper[4753]: I1005 21:11:21.739070 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 05 21:11:21 crc kubenswrapper[4753]: I1005 21:11:21.819286 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6wf2\" (UniqueName: \"kubernetes.io/projected/d3836426-5f36-4383-a406-a2dafcf7d424-kube-api-access-w6wf2\") pod \"d3836426-5f36-4383-a406-a2dafcf7d424\" (UID: \"d3836426-5f36-4383-a406-a2dafcf7d424\") " Oct 05 21:11:21 crc kubenswrapper[4753]: I1005 21:11:21.819419 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3836426-5f36-4383-a406-a2dafcf7d424-utilities\") pod \"d3836426-5f36-4383-a406-a2dafcf7d424\" (UID: \"d3836426-5f36-4383-a406-a2dafcf7d424\") " Oct 05 21:11:21 crc kubenswrapper[4753]: I1005 21:11:21.819520 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3836426-5f36-4383-a406-a2dafcf7d424-catalog-content\") pod \"d3836426-5f36-4383-a406-a2dafcf7d424\" (UID: \"d3836426-5f36-4383-a406-a2dafcf7d424\") " Oct 05 21:11:21 crc kubenswrapper[4753]: I1005 21:11:21.820570 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3836426-5f36-4383-a406-a2dafcf7d424-utilities" 
(OuterVolumeSpecName: "utilities") pod "d3836426-5f36-4383-a406-a2dafcf7d424" (UID: "d3836426-5f36-4383-a406-a2dafcf7d424"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:11:21 crc kubenswrapper[4753]: I1005 21:11:21.835344 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3836426-5f36-4383-a406-a2dafcf7d424-kube-api-access-w6wf2" (OuterVolumeSpecName: "kube-api-access-w6wf2") pod "d3836426-5f36-4383-a406-a2dafcf7d424" (UID: "d3836426-5f36-4383-a406-a2dafcf7d424"). InnerVolumeSpecName "kube-api-access-w6wf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:11:21 crc kubenswrapper[4753]: I1005 21:11:21.892831 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3836426-5f36-4383-a406-a2dafcf7d424-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3836426-5f36-4383-a406-a2dafcf7d424" (UID: "d3836426-5f36-4383-a406-a2dafcf7d424"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:11:21 crc kubenswrapper[4753]: I1005 21:11:21.922281 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3836426-5f36-4383-a406-a2dafcf7d424-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:21 crc kubenswrapper[4753]: I1005 21:11:21.922313 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3836426-5f36-4383-a406-a2dafcf7d424-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:21 crc kubenswrapper[4753]: I1005 21:11:21.922325 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6wf2\" (UniqueName: \"kubernetes.io/projected/d3836426-5f36-4383-a406-a2dafcf7d424-kube-api-access-w6wf2\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:22 crc kubenswrapper[4753]: I1005 21:11:22.007066 4753 generic.go:334] "Generic (PLEG): container finished" podID="d3836426-5f36-4383-a406-a2dafcf7d424" containerID="b0c076480cc88dad2e8ec87e94bfde5f548d032f7ac777c95002eca5bf98638f" exitCode=0 Oct 05 21:11:22 crc kubenswrapper[4753]: I1005 21:11:22.007110 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmw2v" event={"ID":"d3836426-5f36-4383-a406-a2dafcf7d424","Type":"ContainerDied","Data":"b0c076480cc88dad2e8ec87e94bfde5f548d032f7ac777c95002eca5bf98638f"} Oct 05 21:11:22 crc kubenswrapper[4753]: I1005 21:11:22.007148 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmw2v" event={"ID":"d3836426-5f36-4383-a406-a2dafcf7d424","Type":"ContainerDied","Data":"f9145ad139a84b16cc5ade4612353badd4ac2ff61723d040337636194422c490"} Oct 05 21:11:22 crc kubenswrapper[4753]: I1005 21:11:22.007165 4753 scope.go:117] "RemoveContainer" containerID="b0c076480cc88dad2e8ec87e94bfde5f548d032f7ac777c95002eca5bf98638f" Oct 05 21:11:22 crc kubenswrapper[4753]: I1005 
21:11:22.007192 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mmw2v" Oct 05 21:11:22 crc kubenswrapper[4753]: I1005 21:11:22.046809 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mmw2v"] Oct 05 21:11:22 crc kubenswrapper[4753]: I1005 21:11:22.051679 4753 scope.go:117] "RemoveContainer" containerID="dd1ee084fd3fb90f65623793857b562e0fc406156bd16f7790e1d102fa3f3726" Oct 05 21:11:22 crc kubenswrapper[4753]: I1005 21:11:22.053431 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mmw2v"] Oct 05 21:11:22 crc kubenswrapper[4753]: I1005 21:11:22.072338 4753 scope.go:117] "RemoveContainer" containerID="77024ee122f493f29050b76182e05fad1a3dc64e33ad1f3fb26978a81922aca3" Oct 05 21:11:22 crc kubenswrapper[4753]: I1005 21:11:22.109358 4753 scope.go:117] "RemoveContainer" containerID="b0c076480cc88dad2e8ec87e94bfde5f548d032f7ac777c95002eca5bf98638f" Oct 05 21:11:22 crc kubenswrapper[4753]: E1005 21:11:22.109744 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0c076480cc88dad2e8ec87e94bfde5f548d032f7ac777c95002eca5bf98638f\": container with ID starting with b0c076480cc88dad2e8ec87e94bfde5f548d032f7ac777c95002eca5bf98638f not found: ID does not exist" containerID="b0c076480cc88dad2e8ec87e94bfde5f548d032f7ac777c95002eca5bf98638f" Oct 05 21:11:22 crc kubenswrapper[4753]: I1005 21:11:22.109797 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c076480cc88dad2e8ec87e94bfde5f548d032f7ac777c95002eca5bf98638f"} err="failed to get container status \"b0c076480cc88dad2e8ec87e94bfde5f548d032f7ac777c95002eca5bf98638f\": rpc error: code = NotFound desc = could not find container \"b0c076480cc88dad2e8ec87e94bfde5f548d032f7ac777c95002eca5bf98638f\": container with ID starting with 
b0c076480cc88dad2e8ec87e94bfde5f548d032f7ac777c95002eca5bf98638f not found: ID does not exist" Oct 05 21:11:22 crc kubenswrapper[4753]: I1005 21:11:22.109827 4753 scope.go:117] "RemoveContainer" containerID="dd1ee084fd3fb90f65623793857b562e0fc406156bd16f7790e1d102fa3f3726" Oct 05 21:11:22 crc kubenswrapper[4753]: E1005 21:11:22.110358 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd1ee084fd3fb90f65623793857b562e0fc406156bd16f7790e1d102fa3f3726\": container with ID starting with dd1ee084fd3fb90f65623793857b562e0fc406156bd16f7790e1d102fa3f3726 not found: ID does not exist" containerID="dd1ee084fd3fb90f65623793857b562e0fc406156bd16f7790e1d102fa3f3726" Oct 05 21:11:22 crc kubenswrapper[4753]: I1005 21:11:22.110398 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1ee084fd3fb90f65623793857b562e0fc406156bd16f7790e1d102fa3f3726"} err="failed to get container status \"dd1ee084fd3fb90f65623793857b562e0fc406156bd16f7790e1d102fa3f3726\": rpc error: code = NotFound desc = could not find container \"dd1ee084fd3fb90f65623793857b562e0fc406156bd16f7790e1d102fa3f3726\": container with ID starting with dd1ee084fd3fb90f65623793857b562e0fc406156bd16f7790e1d102fa3f3726 not found: ID does not exist" Oct 05 21:11:22 crc kubenswrapper[4753]: I1005 21:11:22.110428 4753 scope.go:117] "RemoveContainer" containerID="77024ee122f493f29050b76182e05fad1a3dc64e33ad1f3fb26978a81922aca3" Oct 05 21:11:22 crc kubenswrapper[4753]: E1005 21:11:22.110703 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77024ee122f493f29050b76182e05fad1a3dc64e33ad1f3fb26978a81922aca3\": container with ID starting with 77024ee122f493f29050b76182e05fad1a3dc64e33ad1f3fb26978a81922aca3 not found: ID does not exist" containerID="77024ee122f493f29050b76182e05fad1a3dc64e33ad1f3fb26978a81922aca3" Oct 05 21:11:22 crc 
kubenswrapper[4753]: I1005 21:11:22.110725 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77024ee122f493f29050b76182e05fad1a3dc64e33ad1f3fb26978a81922aca3"} err="failed to get container status \"77024ee122f493f29050b76182e05fad1a3dc64e33ad1f3fb26978a81922aca3\": rpc error: code = NotFound desc = could not find container \"77024ee122f493f29050b76182e05fad1a3dc64e33ad1f3fb26978a81922aca3\": container with ID starting with 77024ee122f493f29050b76182e05fad1a3dc64e33ad1f3fb26978a81922aca3 not found: ID does not exist" Oct 05 21:11:23 crc kubenswrapper[4753]: I1005 21:11:23.864658 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3836426-5f36-4383-a406-a2dafcf7d424" path="/var/lib/kubelet/pods/d3836426-5f36-4383-a406-a2dafcf7d424/volumes" Oct 05 21:11:28 crc kubenswrapper[4753]: I1005 21:11:28.566524 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vc9ss" podUID="c45c313b-5228-4055-826f-bd89b3ef23b8" containerName="registry-server" probeResult="failure" output=< Oct 05 21:11:28 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 21:11:28 crc kubenswrapper[4753]: > Oct 05 21:11:34 crc kubenswrapper[4753]: I1005 21:11:34.490653 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:11:34 crc kubenswrapper[4753]: I1005 21:11:34.491211 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 
21:11:37 crc kubenswrapper[4753]: I1005 21:11:37.578787 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:37 crc kubenswrapper[4753]: I1005 21:11:37.633525 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:37 crc kubenswrapper[4753]: I1005 21:11:37.821856 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vc9ss"] Oct 05 21:11:39 crc kubenswrapper[4753]: I1005 21:11:39.162461 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vc9ss" podUID="c45c313b-5228-4055-826f-bd89b3ef23b8" containerName="registry-server" containerID="cri-o://00bb0be23cb47229d818a6a2653ee1a6b3db1dc0b7e3658da5fe9fc5b9322ec6" gracePeriod=2 Oct 05 21:11:39 crc kubenswrapper[4753]: I1005 21:11:39.622988 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:39 crc kubenswrapper[4753]: I1005 21:11:39.677935 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c45c313b-5228-4055-826f-bd89b3ef23b8-utilities\") pod \"c45c313b-5228-4055-826f-bd89b3ef23b8\" (UID: \"c45c313b-5228-4055-826f-bd89b3ef23b8\") " Oct 05 21:11:39 crc kubenswrapper[4753]: I1005 21:11:39.678091 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c45c313b-5228-4055-826f-bd89b3ef23b8-catalog-content\") pod \"c45c313b-5228-4055-826f-bd89b3ef23b8\" (UID: \"c45c313b-5228-4055-826f-bd89b3ef23b8\") " Oct 05 21:11:39 crc kubenswrapper[4753]: I1005 21:11:39.678278 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrm58\" (UniqueName: \"kubernetes.io/projected/c45c313b-5228-4055-826f-bd89b3ef23b8-kube-api-access-nrm58\") pod \"c45c313b-5228-4055-826f-bd89b3ef23b8\" (UID: \"c45c313b-5228-4055-826f-bd89b3ef23b8\") " Oct 05 21:11:39 crc kubenswrapper[4753]: I1005 21:11:39.678846 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c45c313b-5228-4055-826f-bd89b3ef23b8-utilities" (OuterVolumeSpecName: "utilities") pod "c45c313b-5228-4055-826f-bd89b3ef23b8" (UID: "c45c313b-5228-4055-826f-bd89b3ef23b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:11:39 crc kubenswrapper[4753]: I1005 21:11:39.688089 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c45c313b-5228-4055-826f-bd89b3ef23b8-kube-api-access-nrm58" (OuterVolumeSpecName: "kube-api-access-nrm58") pod "c45c313b-5228-4055-826f-bd89b3ef23b8" (UID: "c45c313b-5228-4055-826f-bd89b3ef23b8"). InnerVolumeSpecName "kube-api-access-nrm58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:11:39 crc kubenswrapper[4753]: I1005 21:11:39.783402 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrm58\" (UniqueName: \"kubernetes.io/projected/c45c313b-5228-4055-826f-bd89b3ef23b8-kube-api-access-nrm58\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:39 crc kubenswrapper[4753]: I1005 21:11:39.783436 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c45c313b-5228-4055-826f-bd89b3ef23b8-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:39 crc kubenswrapper[4753]: I1005 21:11:39.800902 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c45c313b-5228-4055-826f-bd89b3ef23b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c45c313b-5228-4055-826f-bd89b3ef23b8" (UID: "c45c313b-5228-4055-826f-bd89b3ef23b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:11:39 crc kubenswrapper[4753]: I1005 21:11:39.884972 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c45c313b-5228-4055-826f-bd89b3ef23b8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 21:11:40 crc kubenswrapper[4753]: I1005 21:11:40.171802 4753 generic.go:334] "Generic (PLEG): container finished" podID="c45c313b-5228-4055-826f-bd89b3ef23b8" containerID="00bb0be23cb47229d818a6a2653ee1a6b3db1dc0b7e3658da5fe9fc5b9322ec6" exitCode=0 Oct 05 21:11:40 crc kubenswrapper[4753]: I1005 21:11:40.171844 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vc9ss" event={"ID":"c45c313b-5228-4055-826f-bd89b3ef23b8","Type":"ContainerDied","Data":"00bb0be23cb47229d818a6a2653ee1a6b3db1dc0b7e3658da5fe9fc5b9322ec6"} Oct 05 21:11:40 crc kubenswrapper[4753]: I1005 21:11:40.171872 4753 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-vc9ss" event={"ID":"c45c313b-5228-4055-826f-bd89b3ef23b8","Type":"ContainerDied","Data":"72d75f614102b15776a504bdc435db5d656bab8678dcbb8f2fa232bd3f11b642"} Oct 05 21:11:40 crc kubenswrapper[4753]: I1005 21:11:40.171889 4753 scope.go:117] "RemoveContainer" containerID="00bb0be23cb47229d818a6a2653ee1a6b3db1dc0b7e3658da5fe9fc5b9322ec6" Oct 05 21:11:40 crc kubenswrapper[4753]: I1005 21:11:40.172036 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vc9ss" Oct 05 21:11:40 crc kubenswrapper[4753]: I1005 21:11:40.205447 4753 scope.go:117] "RemoveContainer" containerID="9b49c8090408d9e333888c3576744b4df5daf91bc3beac4f616c55b8cf15bf41" Oct 05 21:11:40 crc kubenswrapper[4753]: I1005 21:11:40.236395 4753 scope.go:117] "RemoveContainer" containerID="adbbddc9eeeed625685d9c6aebad7b58fccb009cfe21b9ddbff83e8e2c46f5f4" Oct 05 21:11:40 crc kubenswrapper[4753]: I1005 21:11:40.236417 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vc9ss"] Oct 05 21:11:40 crc kubenswrapper[4753]: I1005 21:11:40.252059 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vc9ss"] Oct 05 21:11:40 crc kubenswrapper[4753]: I1005 21:11:40.279841 4753 scope.go:117] "RemoveContainer" containerID="00bb0be23cb47229d818a6a2653ee1a6b3db1dc0b7e3658da5fe9fc5b9322ec6" Oct 05 21:11:40 crc kubenswrapper[4753]: E1005 21:11:40.280264 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00bb0be23cb47229d818a6a2653ee1a6b3db1dc0b7e3658da5fe9fc5b9322ec6\": container with ID starting with 00bb0be23cb47229d818a6a2653ee1a6b3db1dc0b7e3658da5fe9fc5b9322ec6 not found: ID does not exist" containerID="00bb0be23cb47229d818a6a2653ee1a6b3db1dc0b7e3658da5fe9fc5b9322ec6" Oct 05 21:11:40 crc kubenswrapper[4753]: I1005 21:11:40.280292 4753 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00bb0be23cb47229d818a6a2653ee1a6b3db1dc0b7e3658da5fe9fc5b9322ec6"} err="failed to get container status \"00bb0be23cb47229d818a6a2653ee1a6b3db1dc0b7e3658da5fe9fc5b9322ec6\": rpc error: code = NotFound desc = could not find container \"00bb0be23cb47229d818a6a2653ee1a6b3db1dc0b7e3658da5fe9fc5b9322ec6\": container with ID starting with 00bb0be23cb47229d818a6a2653ee1a6b3db1dc0b7e3658da5fe9fc5b9322ec6 not found: ID does not exist" Oct 05 21:11:40 crc kubenswrapper[4753]: I1005 21:11:40.280311 4753 scope.go:117] "RemoveContainer" containerID="9b49c8090408d9e333888c3576744b4df5daf91bc3beac4f616c55b8cf15bf41" Oct 05 21:11:40 crc kubenswrapper[4753]: E1005 21:11:40.281571 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b49c8090408d9e333888c3576744b4df5daf91bc3beac4f616c55b8cf15bf41\": container with ID starting with 9b49c8090408d9e333888c3576744b4df5daf91bc3beac4f616c55b8cf15bf41 not found: ID does not exist" containerID="9b49c8090408d9e333888c3576744b4df5daf91bc3beac4f616c55b8cf15bf41" Oct 05 21:11:40 crc kubenswrapper[4753]: I1005 21:11:40.281594 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b49c8090408d9e333888c3576744b4df5daf91bc3beac4f616c55b8cf15bf41"} err="failed to get container status \"9b49c8090408d9e333888c3576744b4df5daf91bc3beac4f616c55b8cf15bf41\": rpc error: code = NotFound desc = could not find container \"9b49c8090408d9e333888c3576744b4df5daf91bc3beac4f616c55b8cf15bf41\": container with ID starting with 9b49c8090408d9e333888c3576744b4df5daf91bc3beac4f616c55b8cf15bf41 not found: ID does not exist" Oct 05 21:11:40 crc kubenswrapper[4753]: I1005 21:11:40.281607 4753 scope.go:117] "RemoveContainer" containerID="adbbddc9eeeed625685d9c6aebad7b58fccb009cfe21b9ddbff83e8e2c46f5f4" Oct 05 21:11:40 crc kubenswrapper[4753]: E1005 
21:11:40.281782 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adbbddc9eeeed625685d9c6aebad7b58fccb009cfe21b9ddbff83e8e2c46f5f4\": container with ID starting with adbbddc9eeeed625685d9c6aebad7b58fccb009cfe21b9ddbff83e8e2c46f5f4 not found: ID does not exist" containerID="adbbddc9eeeed625685d9c6aebad7b58fccb009cfe21b9ddbff83e8e2c46f5f4" Oct 05 21:11:40 crc kubenswrapper[4753]: I1005 21:11:40.281799 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adbbddc9eeeed625685d9c6aebad7b58fccb009cfe21b9ddbff83e8e2c46f5f4"} err="failed to get container status \"adbbddc9eeeed625685d9c6aebad7b58fccb009cfe21b9ddbff83e8e2c46f5f4\": rpc error: code = NotFound desc = could not find container \"adbbddc9eeeed625685d9c6aebad7b58fccb009cfe21b9ddbff83e8e2c46f5f4\": container with ID starting with adbbddc9eeeed625685d9c6aebad7b58fccb009cfe21b9ddbff83e8e2c46f5f4 not found: ID does not exist" Oct 05 21:11:41 crc kubenswrapper[4753]: I1005 21:11:41.866667 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c45c313b-5228-4055-826f-bd89b3ef23b8" path="/var/lib/kubelet/pods/c45c313b-5228-4055-826f-bd89b3ef23b8/volumes" Oct 05 21:12:04 crc kubenswrapper[4753]: I1005 21:12:04.489698 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:12:04 crc kubenswrapper[4753]: I1005 21:12:04.490230 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.171774 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 05 21:12:26 crc kubenswrapper[4753]: E1005 21:12:26.172927 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c45c313b-5228-4055-826f-bd89b3ef23b8" containerName="extract-content" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.172950 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c45c313b-5228-4055-826f-bd89b3ef23b8" containerName="extract-content" Oct 05 21:12:26 crc kubenswrapper[4753]: E1005 21:12:26.172968 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3836426-5f36-4383-a406-a2dafcf7d424" containerName="extract-utilities" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.172979 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3836426-5f36-4383-a406-a2dafcf7d424" containerName="extract-utilities" Oct 05 21:12:26 crc kubenswrapper[4753]: E1005 21:12:26.172994 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c45c313b-5228-4055-826f-bd89b3ef23b8" containerName="extract-utilities" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.173004 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c45c313b-5228-4055-826f-bd89b3ef23b8" containerName="extract-utilities" Oct 05 21:12:26 crc kubenswrapper[4753]: E1005 21:12:26.173021 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af6bc21-d912-431d-aeae-ddf8eaaf1c86" containerName="registry-server" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.173031 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af6bc21-d912-431d-aeae-ddf8eaaf1c86" containerName="registry-server" Oct 05 21:12:26 crc kubenswrapper[4753]: E1005 21:12:26.173053 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c45c313b-5228-4055-826f-bd89b3ef23b8" containerName="registry-server" Oct 05 21:12:26 crc 
kubenswrapper[4753]: I1005 21:12:26.173064 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c45c313b-5228-4055-826f-bd89b3ef23b8" containerName="registry-server" Oct 05 21:12:26 crc kubenswrapper[4753]: E1005 21:12:26.173085 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3836426-5f36-4383-a406-a2dafcf7d424" containerName="extract-content" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.173095 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3836426-5f36-4383-a406-a2dafcf7d424" containerName="extract-content" Oct 05 21:12:26 crc kubenswrapper[4753]: E1005 21:12:26.173135 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af6bc21-d912-431d-aeae-ddf8eaaf1c86" containerName="extract-utilities" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.173169 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af6bc21-d912-431d-aeae-ddf8eaaf1c86" containerName="extract-utilities" Oct 05 21:12:26 crc kubenswrapper[4753]: E1005 21:12:26.173191 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3836426-5f36-4383-a406-a2dafcf7d424" containerName="registry-server" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.173202 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3836426-5f36-4383-a406-a2dafcf7d424" containerName="registry-server" Oct 05 21:12:26 crc kubenswrapper[4753]: E1005 21:12:26.173229 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af6bc21-d912-431d-aeae-ddf8eaaf1c86" containerName="extract-content" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.173239 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af6bc21-d912-431d-aeae-ddf8eaaf1c86" containerName="extract-content" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.173516 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3836426-5f36-4383-a406-a2dafcf7d424" containerName="registry-server" Oct 05 21:12:26 crc 
kubenswrapper[4753]: I1005 21:12:26.173554 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c45c313b-5228-4055-826f-bd89b3ef23b8" containerName="registry-server" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.173577 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af6bc21-d912-431d-aeae-ddf8eaaf1c86" containerName="registry-server" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.174610 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.177412 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.178970 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-9mbwd" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.179108 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.180528 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.181092 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.232000 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.232062 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.232096 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/989178f4-ef23-49c1-88f8-10babb448a68-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.232119 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.232303 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/989178f4-ef23-49c1-88f8-10babb448a68-config-data\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.232339 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/989178f4-ef23-49c1-88f8-10babb448a68-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.232359 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.232400 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbh6w\" (UniqueName: \"kubernetes.io/projected/989178f4-ef23-49c1-88f8-10babb448a68-kube-api-access-rbh6w\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.232432 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/989178f4-ef23-49c1-88f8-10babb448a68-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.333673 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/989178f4-ef23-49c1-88f8-10babb448a68-config-data\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.333722 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/989178f4-ef23-49c1-88f8-10babb448a68-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.333744 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.333766 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbh6w\" (UniqueName: \"kubernetes.io/projected/989178f4-ef23-49c1-88f8-10babb448a68-kube-api-access-rbh6w\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.333786 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/989178f4-ef23-49c1-88f8-10babb448a68-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.334127 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.334955 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.334989 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/989178f4-ef23-49c1-88f8-10babb448a68-openstack-config\") pod 
\"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.335017 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.335292 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.336364 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/989178f4-ef23-49c1-88f8-10babb448a68-config-data\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.339274 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/989178f4-ef23-49c1-88f8-10babb448a68-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.339809 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/989178f4-ef23-49c1-88f8-10babb448a68-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" 
Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.339922 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/989178f4-ef23-49c1-88f8-10babb448a68-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.341113 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.341685 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.342067 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.351165 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbh6w\" (UniqueName: \"kubernetes.io/projected/989178f4-ef23-49c1-88f8-10babb448a68-kube-api-access-rbh6w\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.369066 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.508488 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 05 21:12:26 crc kubenswrapper[4753]: I1005 21:12:26.951684 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 05 21:12:27 crc kubenswrapper[4753]: I1005 21:12:27.629991 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"989178f4-ef23-49c1-88f8-10babb448a68","Type":"ContainerStarted","Data":"bad8d157c8a6f1cf6cc0f6dc7107c1624dabbf2a7aad7e327280600e469c6794"} Oct 05 21:12:34 crc kubenswrapper[4753]: I1005 21:12:34.490618 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:12:34 crc kubenswrapper[4753]: I1005 21:12:34.490913 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:12:34 crc kubenswrapper[4753]: I1005 21:12:34.490963 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 21:12:34 crc kubenswrapper[4753]: I1005 21:12:34.491811 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8d50785440eb66ddc884cdaacfb40867533221c53067d0d7ab2326a58245174c"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 21:12:34 crc kubenswrapper[4753]: I1005 21:12:34.491865 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" containerID="cri-o://8d50785440eb66ddc884cdaacfb40867533221c53067d0d7ab2326a58245174c" gracePeriod=600 Oct 05 21:12:34 crc kubenswrapper[4753]: I1005 21:12:34.700975 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="8d50785440eb66ddc884cdaacfb40867533221c53067d0d7ab2326a58245174c" exitCode=0 Oct 05 21:12:34 crc kubenswrapper[4753]: I1005 21:12:34.701022 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"8d50785440eb66ddc884cdaacfb40867533221c53067d0d7ab2326a58245174c"} Oct 05 21:12:34 crc kubenswrapper[4753]: I1005 21:12:34.701066 4753 scope.go:117] "RemoveContainer" containerID="872b48d57fab3b11cc6fc2651aacae288127069c355ab2cc8b7e166a1408294a" Oct 05 21:12:35 crc kubenswrapper[4753]: I1005 21:12:35.711183 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8"} Oct 05 21:13:00 crc kubenswrapper[4753]: E1005 21:13:00.128890 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 05 21:13:00 crc kubenswrapper[4753]: E1005 21:13:00.132782 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},V
olumeMount{Name:kube-api-access-rbh6w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(989178f4-ef23-49c1-88f8-10babb448a68): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 05 21:13:00 crc kubenswrapper[4753]: E1005 21:13:00.134259 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="989178f4-ef23-49c1-88f8-10babb448a68" Oct 05 21:13:00 crc kubenswrapper[4753]: E1005 21:13:00.984011 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="989178f4-ef23-49c1-88f8-10babb448a68" Oct 05 21:13:15 crc kubenswrapper[4753]: I1005 21:13:15.282992 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 05 21:13:17 crc kubenswrapper[4753]: I1005 21:13:17.136743 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"989178f4-ef23-49c1-88f8-10babb448a68","Type":"ContainerStarted","Data":"7f2b15978fdaced233b753f9a843ccd14297dacb0ae85b887d500fd5c08dd619"} Oct 05 21:13:17 crc kubenswrapper[4753]: I1005 21:13:17.168001 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.847645151 podStartE2EDuration="52.167981396s" podCreationTimestamp="2025-10-05 21:12:25 +0000 UTC" firstStartedPulling="2025-10-05 21:12:26.959949151 +0000 UTC m=+3455.808277383" lastFinishedPulling="2025-10-05 21:13:15.280285396 +0000 UTC m=+3504.128613628" observedRunningTime="2025-10-05 21:13:17.167553163 +0000 UTC m=+3506.015881395" watchObservedRunningTime="2025-10-05 21:13:17.167981396 +0000 UTC m=+3506.016309638" Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.182110 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp"] Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.186567 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.188904 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.189264 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.196736 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp"] Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.251873 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-secret-volume\") pod \"collect-profiles-29328315-tvtcp\" (UID: \"806e9fa5-1f52-4bd8-979d-cf16ff943ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.252053 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv9xn\" (UniqueName: \"kubernetes.io/projected/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-kube-api-access-gv9xn\") pod \"collect-profiles-29328315-tvtcp\" (UID: \"806e9fa5-1f52-4bd8-979d-cf16ff943ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.252106 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-config-volume\") pod \"collect-profiles-29328315-tvtcp\" (UID: \"806e9fa5-1f52-4bd8-979d-cf16ff943ec4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.353756 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv9xn\" (UniqueName: \"kubernetes.io/projected/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-kube-api-access-gv9xn\") pod \"collect-profiles-29328315-tvtcp\" (UID: \"806e9fa5-1f52-4bd8-979d-cf16ff943ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.353821 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-config-volume\") pod \"collect-profiles-29328315-tvtcp\" (UID: \"806e9fa5-1f52-4bd8-979d-cf16ff943ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.353895 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-secret-volume\") pod \"collect-profiles-29328315-tvtcp\" (UID: \"806e9fa5-1f52-4bd8-979d-cf16ff943ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.354773 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-config-volume\") pod \"collect-profiles-29328315-tvtcp\" (UID: \"806e9fa5-1f52-4bd8-979d-cf16ff943ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.367390 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-secret-volume\") pod \"collect-profiles-29328315-tvtcp\" (UID: \"806e9fa5-1f52-4bd8-979d-cf16ff943ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.373997 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv9xn\" (UniqueName: \"kubernetes.io/projected/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-kube-api-access-gv9xn\") pod \"collect-profiles-29328315-tvtcp\" (UID: \"806e9fa5-1f52-4bd8-979d-cf16ff943ec4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.511012 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" Oct 05 21:15:00 crc kubenswrapper[4753]: I1005 21:15:00.983519 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp"] Oct 05 21:15:01 crc kubenswrapper[4753]: I1005 21:15:01.185130 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" event={"ID":"806e9fa5-1f52-4bd8-979d-cf16ff943ec4","Type":"ContainerStarted","Data":"6abba98b26f8bac12879c16c44ce631f116deb20f41d06e74cb95ed95f01251f"} Oct 05 21:15:01 crc kubenswrapper[4753]: I1005 21:15:01.185189 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" event={"ID":"806e9fa5-1f52-4bd8-979d-cf16ff943ec4","Type":"ContainerStarted","Data":"03ba2d39f11d53f2ec5fd6b7dffed4dafd08f18a701d91dd0ddf9d12898dc4f9"} Oct 05 21:15:01 crc kubenswrapper[4753]: I1005 21:15:01.209318 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" 
podStartSLOduration=1.209297082 podStartE2EDuration="1.209297082s" podCreationTimestamp="2025-10-05 21:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 21:15:01.204454573 +0000 UTC m=+3610.052782825" watchObservedRunningTime="2025-10-05 21:15:01.209297082 +0000 UTC m=+3610.057625314" Oct 05 21:15:02 crc kubenswrapper[4753]: I1005 21:15:02.194775 4753 generic.go:334] "Generic (PLEG): container finished" podID="806e9fa5-1f52-4bd8-979d-cf16ff943ec4" containerID="6abba98b26f8bac12879c16c44ce631f116deb20f41d06e74cb95ed95f01251f" exitCode=0 Oct 05 21:15:02 crc kubenswrapper[4753]: I1005 21:15:02.194879 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" event={"ID":"806e9fa5-1f52-4bd8-979d-cf16ff943ec4","Type":"ContainerDied","Data":"6abba98b26f8bac12879c16c44ce631f116deb20f41d06e74cb95ed95f01251f"} Oct 05 21:15:03 crc kubenswrapper[4753]: I1005 21:15:03.567386 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" Oct 05 21:15:03 crc kubenswrapper[4753]: I1005 21:15:03.627103 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv9xn\" (UniqueName: \"kubernetes.io/projected/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-kube-api-access-gv9xn\") pod \"806e9fa5-1f52-4bd8-979d-cf16ff943ec4\" (UID: \"806e9fa5-1f52-4bd8-979d-cf16ff943ec4\") " Oct 05 21:15:03 crc kubenswrapper[4753]: I1005 21:15:03.627426 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-config-volume\") pod \"806e9fa5-1f52-4bd8-979d-cf16ff943ec4\" (UID: \"806e9fa5-1f52-4bd8-979d-cf16ff943ec4\") " Oct 05 21:15:03 crc kubenswrapper[4753]: I1005 21:15:03.627486 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-secret-volume\") pod \"806e9fa5-1f52-4bd8-979d-cf16ff943ec4\" (UID: \"806e9fa5-1f52-4bd8-979d-cf16ff943ec4\") " Oct 05 21:15:03 crc kubenswrapper[4753]: I1005 21:15:03.628202 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-config-volume" (OuterVolumeSpecName: "config-volume") pod "806e9fa5-1f52-4bd8-979d-cf16ff943ec4" (UID: "806e9fa5-1f52-4bd8-979d-cf16ff943ec4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:15:03 crc kubenswrapper[4753]: I1005 21:15:03.633610 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-kube-api-access-gv9xn" (OuterVolumeSpecName: "kube-api-access-gv9xn") pod "806e9fa5-1f52-4bd8-979d-cf16ff943ec4" (UID: "806e9fa5-1f52-4bd8-979d-cf16ff943ec4"). 
InnerVolumeSpecName "kube-api-access-gv9xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:15:03 crc kubenswrapper[4753]: I1005 21:15:03.634310 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "806e9fa5-1f52-4bd8-979d-cf16ff943ec4" (UID: "806e9fa5-1f52-4bd8-979d-cf16ff943ec4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:15:03 crc kubenswrapper[4753]: I1005 21:15:03.737274 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-config-volume\") on node \"crc\" DevicePath \"\"" Oct 05 21:15:03 crc kubenswrapper[4753]: I1005 21:15:03.737307 4753 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 05 21:15:03 crc kubenswrapper[4753]: I1005 21:15:03.737320 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv9xn\" (UniqueName: \"kubernetes.io/projected/806e9fa5-1f52-4bd8-979d-cf16ff943ec4-kube-api-access-gv9xn\") on node \"crc\" DevicePath \"\"" Oct 05 21:15:04 crc kubenswrapper[4753]: I1005 21:15:04.210530 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" event={"ID":"806e9fa5-1f52-4bd8-979d-cf16ff943ec4","Type":"ContainerDied","Data":"03ba2d39f11d53f2ec5fd6b7dffed4dafd08f18a701d91dd0ddf9d12898dc4f9"} Oct 05 21:15:04 crc kubenswrapper[4753]: I1005 21:15:04.210790 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03ba2d39f11d53f2ec5fd6b7dffed4dafd08f18a701d91dd0ddf9d12898dc4f9" Oct 05 21:15:04 crc kubenswrapper[4753]: I1005 21:15:04.210602 4753 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328315-tvtcp" Oct 05 21:15:04 crc kubenswrapper[4753]: I1005 21:15:04.285476 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99"] Oct 05 21:15:04 crc kubenswrapper[4753]: I1005 21:15:04.295088 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328270-p4s99"] Oct 05 21:15:04 crc kubenswrapper[4753]: I1005 21:15:04.489946 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:15:04 crc kubenswrapper[4753]: I1005 21:15:04.490020 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:15:05 crc kubenswrapper[4753]: I1005 21:15:05.872721 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f15f854-d530-42ad-a821-537981a408e3" path="/var/lib/kubelet/pods/3f15f854-d530-42ad-a821-537981a408e3/volumes" Oct 05 21:15:34 crc kubenswrapper[4753]: I1005 21:15:34.490346 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:15:34 crc kubenswrapper[4753]: I1005 21:15:34.490980 4753 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:16:04 crc kubenswrapper[4753]: I1005 21:16:04.490396 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:16:04 crc kubenswrapper[4753]: I1005 21:16:04.491221 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:16:04 crc kubenswrapper[4753]: I1005 21:16:04.491299 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 21:16:04 crc kubenswrapper[4753]: I1005 21:16:04.492526 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 21:16:04 crc kubenswrapper[4753]: I1005 21:16:04.492631 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" 
containerID="cri-o://07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" gracePeriod=600 Oct 05 21:16:04 crc kubenswrapper[4753]: E1005 21:16:04.623396 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:16:04 crc kubenswrapper[4753]: I1005 21:16:04.778314 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" exitCode=0 Oct 05 21:16:04 crc kubenswrapper[4753]: I1005 21:16:04.778361 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8"} Oct 05 21:16:04 crc kubenswrapper[4753]: I1005 21:16:04.778392 4753 scope.go:117] "RemoveContainer" containerID="8d50785440eb66ddc884cdaacfb40867533221c53067d0d7ab2326a58245174c" Oct 05 21:16:04 crc kubenswrapper[4753]: I1005 21:16:04.779287 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:16:04 crc kubenswrapper[4753]: E1005 21:16:04.779816 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" 
podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:16:05 crc kubenswrapper[4753]: I1005 21:16:05.290363 4753 scope.go:117] "RemoveContainer" containerID="98a7c79a6806a609d3663be2d9b211ac912550218710d03e40ec5f75ceccedb4" Oct 05 21:16:16 crc kubenswrapper[4753]: I1005 21:16:16.852975 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:16:16 crc kubenswrapper[4753]: E1005 21:16:16.853717 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:16:28 crc kubenswrapper[4753]: I1005 21:16:28.853279 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:16:28 crc kubenswrapper[4753]: E1005 21:16:28.855403 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:16:42 crc kubenswrapper[4753]: I1005 21:16:42.852430 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:16:42 crc kubenswrapper[4753]: E1005 21:16:42.853372 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:16:56 crc kubenswrapper[4753]: I1005 21:16:56.852010 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:16:56 crc kubenswrapper[4753]: E1005 21:16:56.852841 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:17:05 crc kubenswrapper[4753]: I1005 21:17:05.375352 4753 scope.go:117] "RemoveContainer" containerID="0d911b9d5c07459b55760e6dad72371caa1698a5e45a7485365b02ad5a5779a0" Oct 05 21:17:05 crc kubenswrapper[4753]: I1005 21:17:05.401768 4753 scope.go:117] "RemoveContainer" containerID="4c91d52ec12a3193146c1d03144a6efb0904a8040935097cd50f721bfb459a3c" Oct 05 21:17:05 crc kubenswrapper[4753]: I1005 21:17:05.425659 4753 scope.go:117] "RemoveContainer" containerID="b1499066a2f6f14c1b9e94baad00c8192ff51c0e7adf21bd41431fdc24ebd7f6" Oct 05 21:17:05 crc kubenswrapper[4753]: I1005 21:17:05.448111 4753 scope.go:117] "RemoveContainer" containerID="cfd96578c8aa2bb625423365c4ecd52e9795850174928395e6f2e088e0190c73" Oct 05 21:17:08 crc kubenswrapper[4753]: I1005 21:17:08.852213 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:17:08 crc kubenswrapper[4753]: E1005 21:17:08.853070 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:17:22 crc kubenswrapper[4753]: I1005 21:17:22.852210 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:17:22 crc kubenswrapper[4753]: E1005 21:17:22.853002 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:17:36 crc kubenswrapper[4753]: I1005 21:17:36.852178 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:17:36 crc kubenswrapper[4753]: E1005 21:17:36.853426 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:17:47 crc kubenswrapper[4753]: I1005 21:17:47.852400 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:17:47 crc kubenswrapper[4753]: E1005 21:17:47.853275 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:18:01 crc kubenswrapper[4753]: I1005 21:18:01.861591 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:18:01 crc kubenswrapper[4753]: E1005 21:18:01.862478 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:18:13 crc kubenswrapper[4753]: I1005 21:18:13.852716 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:18:13 crc kubenswrapper[4753]: E1005 21:18:13.853766 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:18:27 crc kubenswrapper[4753]: I1005 21:18:27.852486 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:18:27 crc kubenswrapper[4753]: E1005 21:18:27.853142 4753 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:18:41 crc kubenswrapper[4753]: I1005 21:18:41.858869 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:18:41 crc kubenswrapper[4753]: E1005 21:18:41.859783 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:18:54 crc kubenswrapper[4753]: I1005 21:18:54.853245 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:18:54 crc kubenswrapper[4753]: E1005 21:18:54.854105 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:19:09 crc kubenswrapper[4753]: I1005 21:19:09.851688 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:19:09 crc kubenswrapper[4753]: E1005 21:19:09.852304 4753 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:19:22 crc kubenswrapper[4753]: I1005 21:19:22.852303 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:19:22 crc kubenswrapper[4753]: E1005 21:19:22.853048 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:19:34 crc kubenswrapper[4753]: I1005 21:19:34.852795 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:19:34 crc kubenswrapper[4753]: E1005 21:19:34.853723 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:19:45 crc kubenswrapper[4753]: I1005 21:19:45.852467 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:19:45 crc kubenswrapper[4753]: E1005 21:19:45.853422 4753 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:19:46 crc kubenswrapper[4753]: I1005 21:19:46.055445 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-g89nn"] Oct 05 21:19:46 crc kubenswrapper[4753]: I1005 21:19:46.070916 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-g89nn"] Oct 05 21:19:47 crc kubenswrapper[4753]: I1005 21:19:47.865987 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9de08ec5-cac4-4f6b-8b34-e63f0d613e00" path="/var/lib/kubelet/pods/9de08ec5-cac4-4f6b-8b34-e63f0d613e00/volumes" Oct 05 21:19:58 crc kubenswrapper[4753]: I1005 21:19:58.852549 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:19:58 crc kubenswrapper[4753]: E1005 21:19:58.855155 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:20:01 crc kubenswrapper[4753]: I1005 21:20:01.035028 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-b5a7-account-create-k477x"] Oct 05 21:20:01 crc kubenswrapper[4753]: I1005 21:20:01.049276 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-b5a7-account-create-k477x"] 
Oct 05 21:20:01 crc kubenswrapper[4753]: I1005 21:20:01.863438 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bf632a4-72e3-4567-b353-bf19ee21c255" path="/var/lib/kubelet/pods/5bf632a4-72e3-4567-b353-bf19ee21c255/volumes" Oct 05 21:20:05 crc kubenswrapper[4753]: I1005 21:20:05.592243 4753 scope.go:117] "RemoveContainer" containerID="4a127b0b22bf63d7a7f83d1320a634418650643cde7be0da90306e706633e7d1" Oct 05 21:20:05 crc kubenswrapper[4753]: I1005 21:20:05.638745 4753 scope.go:117] "RemoveContainer" containerID="4ffd383613139a6b942451b9eed5aebd5c0764ae6ba72b97519fcaeac9d4d2b4" Oct 05 21:20:12 crc kubenswrapper[4753]: I1005 21:20:12.853113 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:20:12 crc kubenswrapper[4753]: E1005 21:20:12.853867 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:20:25 crc kubenswrapper[4753]: I1005 21:20:25.053972 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-k5rlq"] Oct 05 21:20:25 crc kubenswrapper[4753]: I1005 21:20:25.067885 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-k5rlq"] Oct 05 21:20:25 crc kubenswrapper[4753]: I1005 21:20:25.863864 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31365694-05a6-4386-98db-b2054a6464f4" path="/var/lib/kubelet/pods/31365694-05a6-4386-98db-b2054a6464f4/volumes" Oct 05 21:20:26 crc kubenswrapper[4753]: I1005 21:20:26.852118 4753 scope.go:117] "RemoveContainer" 
containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:20:26 crc kubenswrapper[4753]: E1005 21:20:26.852992 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:20:41 crc kubenswrapper[4753]: I1005 21:20:41.865740 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:20:41 crc kubenswrapper[4753]: E1005 21:20:41.866543 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:20:52 crc kubenswrapper[4753]: I1005 21:20:52.852337 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:20:52 crc kubenswrapper[4753]: E1005 21:20:52.853164 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:21:01 crc kubenswrapper[4753]: I1005 21:21:01.557959 4753 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7s5hm"] Oct 05 21:21:01 crc kubenswrapper[4753]: E1005 21:21:01.559053 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="806e9fa5-1f52-4bd8-979d-cf16ff943ec4" containerName="collect-profiles" Oct 05 21:21:01 crc kubenswrapper[4753]: I1005 21:21:01.559070 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="806e9fa5-1f52-4bd8-979d-cf16ff943ec4" containerName="collect-profiles" Oct 05 21:21:01 crc kubenswrapper[4753]: I1005 21:21:01.559359 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="806e9fa5-1f52-4bd8-979d-cf16ff943ec4" containerName="collect-profiles" Oct 05 21:21:01 crc kubenswrapper[4753]: I1005 21:21:01.561302 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:01 crc kubenswrapper[4753]: I1005 21:21:01.579388 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7s5hm"] Oct 05 21:21:01 crc kubenswrapper[4753]: I1005 21:21:01.707425 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxbld\" (UniqueName: \"kubernetes.io/projected/804fab32-c45f-406e-943c-0a5372337e5f-kube-api-access-kxbld\") pod \"community-operators-7s5hm\" (UID: \"804fab32-c45f-406e-943c-0a5372337e5f\") " pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:01 crc kubenswrapper[4753]: I1005 21:21:01.707807 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804fab32-c45f-406e-943c-0a5372337e5f-catalog-content\") pod \"community-operators-7s5hm\" (UID: \"804fab32-c45f-406e-943c-0a5372337e5f\") " pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:01 crc kubenswrapper[4753]: I1005 21:21:01.707841 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804fab32-c45f-406e-943c-0a5372337e5f-utilities\") pod \"community-operators-7s5hm\" (UID: \"804fab32-c45f-406e-943c-0a5372337e5f\") " pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:01 crc kubenswrapper[4753]: I1005 21:21:01.810737 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxbld\" (UniqueName: \"kubernetes.io/projected/804fab32-c45f-406e-943c-0a5372337e5f-kube-api-access-kxbld\") pod \"community-operators-7s5hm\" (UID: \"804fab32-c45f-406e-943c-0a5372337e5f\") " pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:01 crc kubenswrapper[4753]: I1005 21:21:01.810828 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804fab32-c45f-406e-943c-0a5372337e5f-catalog-content\") pod \"community-operators-7s5hm\" (UID: \"804fab32-c45f-406e-943c-0a5372337e5f\") " pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:01 crc kubenswrapper[4753]: I1005 21:21:01.810882 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804fab32-c45f-406e-943c-0a5372337e5f-utilities\") pod \"community-operators-7s5hm\" (UID: \"804fab32-c45f-406e-943c-0a5372337e5f\") " pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:01 crc kubenswrapper[4753]: I1005 21:21:01.811546 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804fab32-c45f-406e-943c-0a5372337e5f-catalog-content\") pod \"community-operators-7s5hm\" (UID: \"804fab32-c45f-406e-943c-0a5372337e5f\") " pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:01 crc kubenswrapper[4753]: I1005 21:21:01.811591 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804fab32-c45f-406e-943c-0a5372337e5f-utilities\") pod \"community-operators-7s5hm\" (UID: \"804fab32-c45f-406e-943c-0a5372337e5f\") " pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:01 crc kubenswrapper[4753]: I1005 21:21:01.837845 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxbld\" (UniqueName: \"kubernetes.io/projected/804fab32-c45f-406e-943c-0a5372337e5f-kube-api-access-kxbld\") pod \"community-operators-7s5hm\" (UID: \"804fab32-c45f-406e-943c-0a5372337e5f\") " pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:01 crc kubenswrapper[4753]: I1005 21:21:01.917994 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:02 crc kubenswrapper[4753]: I1005 21:21:02.756017 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7s5hm"] Oct 05 21:21:03 crc kubenswrapper[4753]: I1005 21:21:03.694737 4753 generic.go:334] "Generic (PLEG): container finished" podID="804fab32-c45f-406e-943c-0a5372337e5f" containerID="b4f94dd75006a3811f23a13a3303cce85b2f74e1a1a935a114bb7f031ea3f32a" exitCode=0 Oct 05 21:21:03 crc kubenswrapper[4753]: I1005 21:21:03.694877 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7s5hm" event={"ID":"804fab32-c45f-406e-943c-0a5372337e5f","Type":"ContainerDied","Data":"b4f94dd75006a3811f23a13a3303cce85b2f74e1a1a935a114bb7f031ea3f32a"} Oct 05 21:21:03 crc kubenswrapper[4753]: I1005 21:21:03.695248 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7s5hm" event={"ID":"804fab32-c45f-406e-943c-0a5372337e5f","Type":"ContainerStarted","Data":"7c5704b8a49c4ace3012cc09f193d93a14213842c869e764753b4bae18748158"} Oct 05 21:21:03 crc 
kubenswrapper[4753]: I1005 21:21:03.698321 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 05 21:21:04 crc kubenswrapper[4753]: I1005 21:21:04.707254 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7s5hm" event={"ID":"804fab32-c45f-406e-943c-0a5372337e5f","Type":"ContainerStarted","Data":"2a88e3206ebcf03c93ef6fc4a57a3ca49783a33ae48556a1a71d2fd6079051d9"} Oct 05 21:21:04 crc kubenswrapper[4753]: I1005 21:21:04.852372 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:21:05 crc kubenswrapper[4753]: I1005 21:21:05.724768 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"cf5b3dbe67258318b8270a000eb7f71dacdc8fd82ceca6b74fc564bc02d85d55"} Oct 05 21:21:05 crc kubenswrapper[4753]: I1005 21:21:05.814162 4753 scope.go:117] "RemoveContainer" containerID="12db8e1d570e1e8ffac739b0f8836070f44c7198293bce0878f2d0ec9414634d" Oct 05 21:21:06 crc kubenswrapper[4753]: I1005 21:21:06.737480 4753 generic.go:334] "Generic (PLEG): container finished" podID="804fab32-c45f-406e-943c-0a5372337e5f" containerID="2a88e3206ebcf03c93ef6fc4a57a3ca49783a33ae48556a1a71d2fd6079051d9" exitCode=0 Oct 05 21:21:06 crc kubenswrapper[4753]: I1005 21:21:06.737599 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7s5hm" event={"ID":"804fab32-c45f-406e-943c-0a5372337e5f","Type":"ContainerDied","Data":"2a88e3206ebcf03c93ef6fc4a57a3ca49783a33ae48556a1a71d2fd6079051d9"} Oct 05 21:21:07 crc kubenswrapper[4753]: I1005 21:21:07.748600 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7s5hm" 
event={"ID":"804fab32-c45f-406e-943c-0a5372337e5f","Type":"ContainerStarted","Data":"1f4abcf1e6d92f86e84e2bde0d88bc1d58d2a3a9ed84a5e8113b19ba30135ed4"} Oct 05 21:21:11 crc kubenswrapper[4753]: I1005 21:21:11.919269 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:11 crc kubenswrapper[4753]: I1005 21:21:11.919755 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:12 crc kubenswrapper[4753]: I1005 21:21:12.962349 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-7s5hm" podUID="804fab32-c45f-406e-943c-0a5372337e5f" containerName="registry-server" probeResult="failure" output=< Oct 05 21:21:12 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 21:21:12 crc kubenswrapper[4753]: > Oct 05 21:21:21 crc kubenswrapper[4753]: I1005 21:21:21.964453 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:21 crc kubenswrapper[4753]: I1005 21:21:21.989882 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7s5hm" podStartSLOduration=17.382222252 podStartE2EDuration="20.989865353s" podCreationTimestamp="2025-10-05 21:21:01 +0000 UTC" firstStartedPulling="2025-10-05 21:21:03.697952054 +0000 UTC m=+3972.546280296" lastFinishedPulling="2025-10-05 21:21:07.305595125 +0000 UTC m=+3976.153923397" observedRunningTime="2025-10-05 21:21:07.76926747 +0000 UTC m=+3976.617595722" watchObservedRunningTime="2025-10-05 21:21:21.989865353 +0000 UTC m=+3990.838193585" Oct 05 21:21:22 crc kubenswrapper[4753]: I1005 21:21:22.012831 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:22 crc 
kubenswrapper[4753]: I1005 21:21:22.200637 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7s5hm"] Oct 05 21:21:23 crc kubenswrapper[4753]: I1005 21:21:23.878976 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7s5hm" podUID="804fab32-c45f-406e-943c-0a5372337e5f" containerName="registry-server" containerID="cri-o://1f4abcf1e6d92f86e84e2bde0d88bc1d58d2a3a9ed84a5e8113b19ba30135ed4" gracePeriod=2 Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.497191 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.666768 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804fab32-c45f-406e-943c-0a5372337e5f-catalog-content\") pod \"804fab32-c45f-406e-943c-0a5372337e5f\" (UID: \"804fab32-c45f-406e-943c-0a5372337e5f\") " Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.667046 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804fab32-c45f-406e-943c-0a5372337e5f-utilities\") pod \"804fab32-c45f-406e-943c-0a5372337e5f\" (UID: \"804fab32-c45f-406e-943c-0a5372337e5f\") " Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.667082 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxbld\" (UniqueName: \"kubernetes.io/projected/804fab32-c45f-406e-943c-0a5372337e5f-kube-api-access-kxbld\") pod \"804fab32-c45f-406e-943c-0a5372337e5f\" (UID: \"804fab32-c45f-406e-943c-0a5372337e5f\") " Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.667636 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/804fab32-c45f-406e-943c-0a5372337e5f-utilities" (OuterVolumeSpecName: "utilities") pod "804fab32-c45f-406e-943c-0a5372337e5f" (UID: "804fab32-c45f-406e-943c-0a5372337e5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.692197 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804fab32-c45f-406e-943c-0a5372337e5f-kube-api-access-kxbld" (OuterVolumeSpecName: "kube-api-access-kxbld") pod "804fab32-c45f-406e-943c-0a5372337e5f" (UID: "804fab32-c45f-406e-943c-0a5372337e5f"). InnerVolumeSpecName "kube-api-access-kxbld". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.734837 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804fab32-c45f-406e-943c-0a5372337e5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "804fab32-c45f-406e-943c-0a5372337e5f" (UID: "804fab32-c45f-406e-943c-0a5372337e5f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.769239 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804fab32-c45f-406e-943c-0a5372337e5f-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.769492 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxbld\" (UniqueName: \"kubernetes.io/projected/804fab32-c45f-406e-943c-0a5372337e5f-kube-api-access-kxbld\") on node \"crc\" DevicePath \"\"" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.769554 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804fab32-c45f-406e-943c-0a5372337e5f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.887855 4753 generic.go:334] "Generic (PLEG): container finished" podID="804fab32-c45f-406e-943c-0a5372337e5f" containerID="1f4abcf1e6d92f86e84e2bde0d88bc1d58d2a3a9ed84a5e8113b19ba30135ed4" exitCode=0 Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.887923 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7s5hm" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.887940 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7s5hm" event={"ID":"804fab32-c45f-406e-943c-0a5372337e5f","Type":"ContainerDied","Data":"1f4abcf1e6d92f86e84e2bde0d88bc1d58d2a3a9ed84a5e8113b19ba30135ed4"} Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.889067 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7s5hm" event={"ID":"804fab32-c45f-406e-943c-0a5372337e5f","Type":"ContainerDied","Data":"7c5704b8a49c4ace3012cc09f193d93a14213842c869e764753b4bae18748158"} Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.889089 4753 scope.go:117] "RemoveContainer" containerID="1f4abcf1e6d92f86e84e2bde0d88bc1d58d2a3a9ed84a5e8113b19ba30135ed4" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.920395 4753 scope.go:117] "RemoveContainer" containerID="2a88e3206ebcf03c93ef6fc4a57a3ca49783a33ae48556a1a71d2fd6079051d9" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.921321 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7s5hm"] Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.941066 4753 scope.go:117] "RemoveContainer" containerID="b4f94dd75006a3811f23a13a3303cce85b2f74e1a1a935a114bb7f031ea3f32a" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.942818 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7s5hm"] Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.979234 4753 scope.go:117] "RemoveContainer" containerID="1f4abcf1e6d92f86e84e2bde0d88bc1d58d2a3a9ed84a5e8113b19ba30135ed4" Oct 05 21:21:24 crc kubenswrapper[4753]: E1005 21:21:24.980249 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1f4abcf1e6d92f86e84e2bde0d88bc1d58d2a3a9ed84a5e8113b19ba30135ed4\": container with ID starting with 1f4abcf1e6d92f86e84e2bde0d88bc1d58d2a3a9ed84a5e8113b19ba30135ed4 not found: ID does not exist" containerID="1f4abcf1e6d92f86e84e2bde0d88bc1d58d2a3a9ed84a5e8113b19ba30135ed4" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.980307 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4abcf1e6d92f86e84e2bde0d88bc1d58d2a3a9ed84a5e8113b19ba30135ed4"} err="failed to get container status \"1f4abcf1e6d92f86e84e2bde0d88bc1d58d2a3a9ed84a5e8113b19ba30135ed4\": rpc error: code = NotFound desc = could not find container \"1f4abcf1e6d92f86e84e2bde0d88bc1d58d2a3a9ed84a5e8113b19ba30135ed4\": container with ID starting with 1f4abcf1e6d92f86e84e2bde0d88bc1d58d2a3a9ed84a5e8113b19ba30135ed4 not found: ID does not exist" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.980339 4753 scope.go:117] "RemoveContainer" containerID="2a88e3206ebcf03c93ef6fc4a57a3ca49783a33ae48556a1a71d2fd6079051d9" Oct 05 21:21:24 crc kubenswrapper[4753]: E1005 21:21:24.980953 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a88e3206ebcf03c93ef6fc4a57a3ca49783a33ae48556a1a71d2fd6079051d9\": container with ID starting with 2a88e3206ebcf03c93ef6fc4a57a3ca49783a33ae48556a1a71d2fd6079051d9 not found: ID does not exist" containerID="2a88e3206ebcf03c93ef6fc4a57a3ca49783a33ae48556a1a71d2fd6079051d9" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.980993 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a88e3206ebcf03c93ef6fc4a57a3ca49783a33ae48556a1a71d2fd6079051d9"} err="failed to get container status \"2a88e3206ebcf03c93ef6fc4a57a3ca49783a33ae48556a1a71d2fd6079051d9\": rpc error: code = NotFound desc = could not find container \"2a88e3206ebcf03c93ef6fc4a57a3ca49783a33ae48556a1a71d2fd6079051d9\": container with ID 
starting with 2a88e3206ebcf03c93ef6fc4a57a3ca49783a33ae48556a1a71d2fd6079051d9 not found: ID does not exist" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.981022 4753 scope.go:117] "RemoveContainer" containerID="b4f94dd75006a3811f23a13a3303cce85b2f74e1a1a935a114bb7f031ea3f32a" Oct 05 21:21:24 crc kubenswrapper[4753]: E1005 21:21:24.981392 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4f94dd75006a3811f23a13a3303cce85b2f74e1a1a935a114bb7f031ea3f32a\": container with ID starting with b4f94dd75006a3811f23a13a3303cce85b2f74e1a1a935a114bb7f031ea3f32a not found: ID does not exist" containerID="b4f94dd75006a3811f23a13a3303cce85b2f74e1a1a935a114bb7f031ea3f32a" Oct 05 21:21:24 crc kubenswrapper[4753]: I1005 21:21:24.981420 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f94dd75006a3811f23a13a3303cce85b2f74e1a1a935a114bb7f031ea3f32a"} err="failed to get container status \"b4f94dd75006a3811f23a13a3303cce85b2f74e1a1a935a114bb7f031ea3f32a\": rpc error: code = NotFound desc = could not find container \"b4f94dd75006a3811f23a13a3303cce85b2f74e1a1a935a114bb7f031ea3f32a\": container with ID starting with b4f94dd75006a3811f23a13a3303cce85b2f74e1a1a935a114bb7f031ea3f32a not found: ID does not exist" Oct 05 21:21:25 crc kubenswrapper[4753]: I1005 21:21:25.862711 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804fab32-c45f-406e-943c-0a5372337e5f" path="/var/lib/kubelet/pods/804fab32-c45f-406e-943c-0a5372337e5f/volumes" Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.412161 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nn44x"] Oct 05 21:21:27 crc kubenswrapper[4753]: E1005 21:21:27.412548 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804fab32-c45f-406e-943c-0a5372337e5f" containerName="extract-content" Oct 05 21:21:27 crc 
kubenswrapper[4753]: I1005 21:21:27.412560 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="804fab32-c45f-406e-943c-0a5372337e5f" containerName="extract-content" Oct 05 21:21:27 crc kubenswrapper[4753]: E1005 21:21:27.412576 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804fab32-c45f-406e-943c-0a5372337e5f" containerName="registry-server" Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.412582 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="804fab32-c45f-406e-943c-0a5372337e5f" containerName="registry-server" Oct 05 21:21:27 crc kubenswrapper[4753]: E1005 21:21:27.412599 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804fab32-c45f-406e-943c-0a5372337e5f" containerName="extract-utilities" Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.412604 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="804fab32-c45f-406e-943c-0a5372337e5f" containerName="extract-utilities" Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.412774 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="804fab32-c45f-406e-943c-0a5372337e5f" containerName="registry-server" Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.414057 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.428400 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nn44x"] Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.513939 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-catalog-content\") pod \"certified-operators-nn44x\" (UID: \"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47\") " pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.513997 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-utilities\") pod \"certified-operators-nn44x\" (UID: \"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47\") " pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.514512 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwjlv\" (UniqueName: \"kubernetes.io/projected/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-kube-api-access-xwjlv\") pod \"certified-operators-nn44x\" (UID: \"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47\") " pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.616182 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwjlv\" (UniqueName: \"kubernetes.io/projected/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-kube-api-access-xwjlv\") pod \"certified-operators-nn44x\" (UID: \"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47\") " pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.616250 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-catalog-content\") pod \"certified-operators-nn44x\" (UID: \"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47\") " pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.616289 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-utilities\") pod \"certified-operators-nn44x\" (UID: \"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47\") " pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.616852 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-utilities\") pod \"certified-operators-nn44x\" (UID: \"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47\") " pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.616920 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-catalog-content\") pod \"certified-operators-nn44x\" (UID: \"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47\") " pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.633708 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwjlv\" (UniqueName: \"kubernetes.io/projected/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-kube-api-access-xwjlv\") pod \"certified-operators-nn44x\" (UID: \"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47\") " pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:27 crc kubenswrapper[4753]: I1005 21:21:27.732330 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:28 crc kubenswrapper[4753]: I1005 21:21:28.245517 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nn44x"] Oct 05 21:21:28 crc kubenswrapper[4753]: I1005 21:21:28.924413 4753 generic.go:334] "Generic (PLEG): container finished" podID="a311c60b-e9c5-4ba5-a913-d1ef5db7ff47" containerID="2f73abaacb913ab77c9be85b16c880e6d88e88f1a7143048f637db53ba9eb8b9" exitCode=0 Oct 05 21:21:28 crc kubenswrapper[4753]: I1005 21:21:28.924462 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn44x" event={"ID":"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47","Type":"ContainerDied","Data":"2f73abaacb913ab77c9be85b16c880e6d88e88f1a7143048f637db53ba9eb8b9"} Oct 05 21:21:28 crc kubenswrapper[4753]: I1005 21:21:28.924492 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn44x" event={"ID":"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47","Type":"ContainerStarted","Data":"127dbdd95af6367efc1a77e4f095b71c02972666e8ab6249fdf2a34d08213335"} Oct 05 21:21:29 crc kubenswrapper[4753]: I1005 21:21:29.934021 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn44x" event={"ID":"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47","Type":"ContainerStarted","Data":"46ae1db0289a390a580adec191ede414d498463e3d3c7e4c32f82c90099315f2"} Oct 05 21:21:30 crc kubenswrapper[4753]: I1005 21:21:30.815763 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v7fpp"] Oct 05 21:21:30 crc kubenswrapper[4753]: I1005 21:21:30.818869 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:30 crc kubenswrapper[4753]: I1005 21:21:30.829196 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v7fpp"] Oct 05 21:21:30 crc kubenswrapper[4753]: I1005 21:21:30.880078 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a9c7cba-7b98-42f1-b436-334fa4f06574-catalog-content\") pod \"redhat-marketplace-v7fpp\" (UID: \"2a9c7cba-7b98-42f1-b436-334fa4f06574\") " pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:30 crc kubenswrapper[4753]: I1005 21:21:30.880256 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a9c7cba-7b98-42f1-b436-334fa4f06574-utilities\") pod \"redhat-marketplace-v7fpp\" (UID: \"2a9c7cba-7b98-42f1-b436-334fa4f06574\") " pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:30 crc kubenswrapper[4753]: I1005 21:21:30.880555 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsb7p\" (UniqueName: \"kubernetes.io/projected/2a9c7cba-7b98-42f1-b436-334fa4f06574-kube-api-access-xsb7p\") pod \"redhat-marketplace-v7fpp\" (UID: \"2a9c7cba-7b98-42f1-b436-334fa4f06574\") " pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:30 crc kubenswrapper[4753]: I1005 21:21:30.981879 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsb7p\" (UniqueName: \"kubernetes.io/projected/2a9c7cba-7b98-42f1-b436-334fa4f06574-kube-api-access-xsb7p\") pod \"redhat-marketplace-v7fpp\" (UID: \"2a9c7cba-7b98-42f1-b436-334fa4f06574\") " pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:30 crc kubenswrapper[4753]: I1005 21:21:30.981928 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a9c7cba-7b98-42f1-b436-334fa4f06574-catalog-content\") pod \"redhat-marketplace-v7fpp\" (UID: \"2a9c7cba-7b98-42f1-b436-334fa4f06574\") " pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:30 crc kubenswrapper[4753]: I1005 21:21:30.982030 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a9c7cba-7b98-42f1-b436-334fa4f06574-utilities\") pod \"redhat-marketplace-v7fpp\" (UID: \"2a9c7cba-7b98-42f1-b436-334fa4f06574\") " pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:30 crc kubenswrapper[4753]: I1005 21:21:30.982447 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a9c7cba-7b98-42f1-b436-334fa4f06574-catalog-content\") pod \"redhat-marketplace-v7fpp\" (UID: \"2a9c7cba-7b98-42f1-b436-334fa4f06574\") " pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:30 crc kubenswrapper[4753]: I1005 21:21:30.982652 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a9c7cba-7b98-42f1-b436-334fa4f06574-utilities\") pod \"redhat-marketplace-v7fpp\" (UID: \"2a9c7cba-7b98-42f1-b436-334fa4f06574\") " pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:31 crc kubenswrapper[4753]: I1005 21:21:31.009127 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsb7p\" (UniqueName: \"kubernetes.io/projected/2a9c7cba-7b98-42f1-b436-334fa4f06574-kube-api-access-xsb7p\") pod \"redhat-marketplace-v7fpp\" (UID: \"2a9c7cba-7b98-42f1-b436-334fa4f06574\") " pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:31 crc kubenswrapper[4753]: I1005 21:21:31.139028 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:31 crc kubenswrapper[4753]: I1005 21:21:31.658165 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v7fpp"] Oct 05 21:21:31 crc kubenswrapper[4753]: W1005 21:21:31.664167 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a9c7cba_7b98_42f1_b436_334fa4f06574.slice/crio-6750cdd1580e210cd66d2564adeb8fc8321afeb6e2f1bfef387cb864bb9a4652 WatchSource:0}: Error finding container 6750cdd1580e210cd66d2564adeb8fc8321afeb6e2f1bfef387cb864bb9a4652: Status 404 returned error can't find the container with id 6750cdd1580e210cd66d2564adeb8fc8321afeb6e2f1bfef387cb864bb9a4652 Oct 05 21:21:31 crc kubenswrapper[4753]: I1005 21:21:31.953824 4753 generic.go:334] "Generic (PLEG): container finished" podID="a311c60b-e9c5-4ba5-a913-d1ef5db7ff47" containerID="46ae1db0289a390a580adec191ede414d498463e3d3c7e4c32f82c90099315f2" exitCode=0 Oct 05 21:21:31 crc kubenswrapper[4753]: I1005 21:21:31.953898 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn44x" event={"ID":"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47","Type":"ContainerDied","Data":"46ae1db0289a390a580adec191ede414d498463e3d3c7e4c32f82c90099315f2"} Oct 05 21:21:31 crc kubenswrapper[4753]: I1005 21:21:31.955888 4753 generic.go:334] "Generic (PLEG): container finished" podID="2a9c7cba-7b98-42f1-b436-334fa4f06574" containerID="2ee65d62eaaf9b52d2c8146fe83824b7c08cc2726d4bb57c9532788135dcd62a" exitCode=0 Oct 05 21:21:31 crc kubenswrapper[4753]: I1005 21:21:31.955919 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7fpp" event={"ID":"2a9c7cba-7b98-42f1-b436-334fa4f06574","Type":"ContainerDied","Data":"2ee65d62eaaf9b52d2c8146fe83824b7c08cc2726d4bb57c9532788135dcd62a"} Oct 05 21:21:31 crc kubenswrapper[4753]: I1005 
21:21:31.955938 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7fpp" event={"ID":"2a9c7cba-7b98-42f1-b436-334fa4f06574","Type":"ContainerStarted","Data":"6750cdd1580e210cd66d2564adeb8fc8321afeb6e2f1bfef387cb864bb9a4652"} Oct 05 21:21:32 crc kubenswrapper[4753]: I1005 21:21:32.222473 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k4gxz"] Oct 05 21:21:32 crc kubenswrapper[4753]: I1005 21:21:32.239237 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k4gxz"] Oct 05 21:21:32 crc kubenswrapper[4753]: I1005 21:21:32.239333 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:21:32 crc kubenswrapper[4753]: I1005 21:21:32.309692 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/914b9197-febb-4aa8-a221-53b0009d2a7b-utilities\") pod \"redhat-operators-k4gxz\" (UID: \"914b9197-febb-4aa8-a221-53b0009d2a7b\") " pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:21:32 crc kubenswrapper[4753]: I1005 21:21:32.309732 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/914b9197-febb-4aa8-a221-53b0009d2a7b-catalog-content\") pod \"redhat-operators-k4gxz\" (UID: \"914b9197-febb-4aa8-a221-53b0009d2a7b\") " pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:21:32 crc kubenswrapper[4753]: I1005 21:21:32.309783 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frscv\" (UniqueName: \"kubernetes.io/projected/914b9197-febb-4aa8-a221-53b0009d2a7b-kube-api-access-frscv\") pod \"redhat-operators-k4gxz\" (UID: \"914b9197-febb-4aa8-a221-53b0009d2a7b\") " 
pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:21:32 crc kubenswrapper[4753]: I1005 21:21:32.412416 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/914b9197-febb-4aa8-a221-53b0009d2a7b-utilities\") pod \"redhat-operators-k4gxz\" (UID: \"914b9197-febb-4aa8-a221-53b0009d2a7b\") " pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:21:32 crc kubenswrapper[4753]: I1005 21:21:32.412462 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/914b9197-febb-4aa8-a221-53b0009d2a7b-catalog-content\") pod \"redhat-operators-k4gxz\" (UID: \"914b9197-febb-4aa8-a221-53b0009d2a7b\") " pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:21:32 crc kubenswrapper[4753]: I1005 21:21:32.412517 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frscv\" (UniqueName: \"kubernetes.io/projected/914b9197-febb-4aa8-a221-53b0009d2a7b-kube-api-access-frscv\") pod \"redhat-operators-k4gxz\" (UID: \"914b9197-febb-4aa8-a221-53b0009d2a7b\") " pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:21:32 crc kubenswrapper[4753]: I1005 21:21:32.412869 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/914b9197-febb-4aa8-a221-53b0009d2a7b-utilities\") pod \"redhat-operators-k4gxz\" (UID: \"914b9197-febb-4aa8-a221-53b0009d2a7b\") " pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:21:32 crc kubenswrapper[4753]: I1005 21:21:32.412909 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/914b9197-febb-4aa8-a221-53b0009d2a7b-catalog-content\") pod \"redhat-operators-k4gxz\" (UID: \"914b9197-febb-4aa8-a221-53b0009d2a7b\") " pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:21:32 crc 
kubenswrapper[4753]: I1005 21:21:32.433190 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frscv\" (UniqueName: \"kubernetes.io/projected/914b9197-febb-4aa8-a221-53b0009d2a7b-kube-api-access-frscv\") pod \"redhat-operators-k4gxz\" (UID: \"914b9197-febb-4aa8-a221-53b0009d2a7b\") " pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:21:32 crc kubenswrapper[4753]: I1005 21:21:32.563601 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:21:33 crc kubenswrapper[4753]: I1005 21:21:33.018418 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn44x" event={"ID":"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47","Type":"ContainerStarted","Data":"763cb9dfc1280c91f5a6e2e22fdbcc89428b0d3d9f9d56b78c7440dc7c1431ad"} Oct 05 21:21:33 crc kubenswrapper[4753]: I1005 21:21:33.063284 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nn44x" podStartSLOduration=2.519618153 podStartE2EDuration="6.063261421s" podCreationTimestamp="2025-10-05 21:21:27 +0000 UTC" firstStartedPulling="2025-10-05 21:21:28.926718284 +0000 UTC m=+3997.775046516" lastFinishedPulling="2025-10-05 21:21:32.470361552 +0000 UTC m=+4001.318689784" observedRunningTime="2025-10-05 21:21:33.053221153 +0000 UTC m=+4001.901549385" watchObservedRunningTime="2025-10-05 21:21:33.063261421 +0000 UTC m=+4001.911589653" Oct 05 21:21:33 crc kubenswrapper[4753]: I1005 21:21:33.207798 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k4gxz"] Oct 05 21:21:34 crc kubenswrapper[4753]: I1005 21:21:34.029648 4753 generic.go:334] "Generic (PLEG): container finished" podID="2a9c7cba-7b98-42f1-b436-334fa4f06574" containerID="421a75d7b8d2956abee7e087a3478e0c2db295afbc1fe1d5e8c17095478fec4f" exitCode=0 Oct 05 21:21:34 crc kubenswrapper[4753]: I1005 
21:21:34.029790 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7fpp" event={"ID":"2a9c7cba-7b98-42f1-b436-334fa4f06574","Type":"ContainerDied","Data":"421a75d7b8d2956abee7e087a3478e0c2db295afbc1fe1d5e8c17095478fec4f"} Oct 05 21:21:34 crc kubenswrapper[4753]: I1005 21:21:34.036832 4753 generic.go:334] "Generic (PLEG): container finished" podID="914b9197-febb-4aa8-a221-53b0009d2a7b" containerID="2a003fc2d292754745262fa237d04def24126a1a2aec93f5075856e51033bfa8" exitCode=0 Oct 05 21:21:34 crc kubenswrapper[4753]: I1005 21:21:34.036877 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gxz" event={"ID":"914b9197-febb-4aa8-a221-53b0009d2a7b","Type":"ContainerDied","Data":"2a003fc2d292754745262fa237d04def24126a1a2aec93f5075856e51033bfa8"} Oct 05 21:21:34 crc kubenswrapper[4753]: I1005 21:21:34.036957 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gxz" event={"ID":"914b9197-febb-4aa8-a221-53b0009d2a7b","Type":"ContainerStarted","Data":"0739dc45eec584f7cdd370bb0cd5f6d56b719c2db0ef0669e0a24304939cc6f6"} Oct 05 21:21:35 crc kubenswrapper[4753]: I1005 21:21:35.047269 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7fpp" event={"ID":"2a9c7cba-7b98-42f1-b436-334fa4f06574","Type":"ContainerStarted","Data":"3850a0a21da018d017682f11a467bbdac2ae6ccdd20cb80563a7dfa0a63ad791"} Oct 05 21:21:35 crc kubenswrapper[4753]: I1005 21:21:35.049562 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gxz" event={"ID":"914b9197-febb-4aa8-a221-53b0009d2a7b","Type":"ContainerStarted","Data":"70a6752db98c853152f14be12f225adb5d7de28100e5a5f5c7cfc45ddc5e5039"} Oct 05 21:21:35 crc kubenswrapper[4753]: I1005 21:21:35.095069 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-v7fpp" podStartSLOduration=2.47580642 podStartE2EDuration="5.095052749s" podCreationTimestamp="2025-10-05 21:21:30 +0000 UTC" firstStartedPulling="2025-10-05 21:21:31.957860891 +0000 UTC m=+4000.806189123" lastFinishedPulling="2025-10-05 21:21:34.57710723 +0000 UTC m=+4003.425435452" observedRunningTime="2025-10-05 21:21:35.071769906 +0000 UTC m=+4003.920098138" watchObservedRunningTime="2025-10-05 21:21:35.095052749 +0000 UTC m=+4003.943380981" Oct 05 21:21:37 crc kubenswrapper[4753]: I1005 21:21:37.732921 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:37 crc kubenswrapper[4753]: I1005 21:21:37.733484 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:37 crc kubenswrapper[4753]: I1005 21:21:37.866700 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:38 crc kubenswrapper[4753]: I1005 21:21:38.133191 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:39 crc kubenswrapper[4753]: I1005 21:21:39.097919 4753 generic.go:334] "Generic (PLEG): container finished" podID="914b9197-febb-4aa8-a221-53b0009d2a7b" containerID="70a6752db98c853152f14be12f225adb5d7de28100e5a5f5c7cfc45ddc5e5039" exitCode=0 Oct 05 21:21:39 crc kubenswrapper[4753]: I1005 21:21:39.099675 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gxz" event={"ID":"914b9197-febb-4aa8-a221-53b0009d2a7b","Type":"ContainerDied","Data":"70a6752db98c853152f14be12f225adb5d7de28100e5a5f5c7cfc45ddc5e5039"} Oct 05 21:21:40 crc kubenswrapper[4753]: I1005 21:21:40.110324 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gxz" 
event={"ID":"914b9197-febb-4aa8-a221-53b0009d2a7b","Type":"ContainerStarted","Data":"c0f85eef4a3af73cbed89d12b8cbd1b873ff3ae93bdabe51f1f3b963c31c35bd"} Oct 05 21:21:40 crc kubenswrapper[4753]: I1005 21:21:40.133875 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k4gxz" podStartSLOduration=2.45843128 podStartE2EDuration="8.133855085s" podCreationTimestamp="2025-10-05 21:21:32 +0000 UTC" firstStartedPulling="2025-10-05 21:21:34.038317372 +0000 UTC m=+4002.886645604" lastFinishedPulling="2025-10-05 21:21:39.713741167 +0000 UTC m=+4008.562069409" observedRunningTime="2025-10-05 21:21:40.127062966 +0000 UTC m=+4008.975391208" watchObservedRunningTime="2025-10-05 21:21:40.133855085 +0000 UTC m=+4008.982183317" Oct 05 21:21:41 crc kubenswrapper[4753]: I1005 21:21:41.139187 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:41 crc kubenswrapper[4753]: I1005 21:21:41.139455 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:41 crc kubenswrapper[4753]: I1005 21:21:41.206628 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:42 crc kubenswrapper[4753]: I1005 21:21:42.172926 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:42 crc kubenswrapper[4753]: I1005 21:21:42.564755 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:21:42 crc kubenswrapper[4753]: I1005 21:21:42.565100 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:21:43 crc kubenswrapper[4753]: I1005 21:21:43.401214 4753 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nn44x"] Oct 05 21:21:43 crc kubenswrapper[4753]: I1005 21:21:43.401458 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nn44x" podUID="a311c60b-e9c5-4ba5-a913-d1ef5db7ff47" containerName="registry-server" containerID="cri-o://763cb9dfc1280c91f5a6e2e22fdbcc89428b0d3d9f9d56b78c7440dc7c1431ad" gracePeriod=2 Oct 05 21:21:43 crc kubenswrapper[4753]: I1005 21:21:43.610268 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k4gxz" podUID="914b9197-febb-4aa8-a221-53b0009d2a7b" containerName="registry-server" probeResult="failure" output=< Oct 05 21:21:43 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 21:21:43 crc kubenswrapper[4753]: > Oct 05 21:21:43 crc kubenswrapper[4753]: I1005 21:21:43.993683 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.058859 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-utilities\") pod \"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47\" (UID: \"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47\") " Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.058908 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-catalog-content\") pod \"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47\" (UID: \"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47\") " Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.059015 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwjlv\" (UniqueName: 
\"kubernetes.io/projected/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-kube-api-access-xwjlv\") pod \"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47\" (UID: \"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47\") " Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.064170 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-utilities" (OuterVolumeSpecName: "utilities") pod "a311c60b-e9c5-4ba5-a913-d1ef5db7ff47" (UID: "a311c60b-e9c5-4ba5-a913-d1ef5db7ff47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.086774 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-kube-api-access-xwjlv" (OuterVolumeSpecName: "kube-api-access-xwjlv") pod "a311c60b-e9c5-4ba5-a913-d1ef5db7ff47" (UID: "a311c60b-e9c5-4ba5-a913-d1ef5db7ff47"). InnerVolumeSpecName "kube-api-access-xwjlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.110875 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a311c60b-e9c5-4ba5-a913-d1ef5db7ff47" (UID: "a311c60b-e9c5-4ba5-a913-d1ef5db7ff47"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.144452 4753 generic.go:334] "Generic (PLEG): container finished" podID="a311c60b-e9c5-4ba5-a913-d1ef5db7ff47" containerID="763cb9dfc1280c91f5a6e2e22fdbcc89428b0d3d9f9d56b78c7440dc7c1431ad" exitCode=0 Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.144508 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn44x" event={"ID":"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47","Type":"ContainerDied","Data":"763cb9dfc1280c91f5a6e2e22fdbcc89428b0d3d9f9d56b78c7440dc7c1431ad"} Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.144819 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn44x" event={"ID":"a311c60b-e9c5-4ba5-a913-d1ef5db7ff47","Type":"ContainerDied","Data":"127dbdd95af6367efc1a77e4f095b71c02972666e8ab6249fdf2a34d08213335"} Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.144844 4753 scope.go:117] "RemoveContainer" containerID="763cb9dfc1280c91f5a6e2e22fdbcc89428b0d3d9f9d56b78c7440dc7c1431ad" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.144561 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nn44x" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.163406 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.163436 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.163446 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwjlv\" (UniqueName: \"kubernetes.io/projected/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47-kube-api-access-xwjlv\") on node \"crc\" DevicePath \"\"" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.178794 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nn44x"] Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.179644 4753 scope.go:117] "RemoveContainer" containerID="46ae1db0289a390a580adec191ede414d498463e3d3c7e4c32f82c90099315f2" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.189834 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nn44x"] Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.214806 4753 scope.go:117] "RemoveContainer" containerID="2f73abaacb913ab77c9be85b16c880e6d88e88f1a7143048f637db53ba9eb8b9" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.254616 4753 scope.go:117] "RemoveContainer" containerID="763cb9dfc1280c91f5a6e2e22fdbcc89428b0d3d9f9d56b78c7440dc7c1431ad" Oct 05 21:21:44 crc kubenswrapper[4753]: E1005 21:21:44.255390 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"763cb9dfc1280c91f5a6e2e22fdbcc89428b0d3d9f9d56b78c7440dc7c1431ad\": container with ID starting with 763cb9dfc1280c91f5a6e2e22fdbcc89428b0d3d9f9d56b78c7440dc7c1431ad not found: ID does not exist" containerID="763cb9dfc1280c91f5a6e2e22fdbcc89428b0d3d9f9d56b78c7440dc7c1431ad" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.255443 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763cb9dfc1280c91f5a6e2e22fdbcc89428b0d3d9f9d56b78c7440dc7c1431ad"} err="failed to get container status \"763cb9dfc1280c91f5a6e2e22fdbcc89428b0d3d9f9d56b78c7440dc7c1431ad\": rpc error: code = NotFound desc = could not find container \"763cb9dfc1280c91f5a6e2e22fdbcc89428b0d3d9f9d56b78c7440dc7c1431ad\": container with ID starting with 763cb9dfc1280c91f5a6e2e22fdbcc89428b0d3d9f9d56b78c7440dc7c1431ad not found: ID does not exist" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.255473 4753 scope.go:117] "RemoveContainer" containerID="46ae1db0289a390a580adec191ede414d498463e3d3c7e4c32f82c90099315f2" Oct 05 21:21:44 crc kubenswrapper[4753]: E1005 21:21:44.256373 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46ae1db0289a390a580adec191ede414d498463e3d3c7e4c32f82c90099315f2\": container with ID starting with 46ae1db0289a390a580adec191ede414d498463e3d3c7e4c32f82c90099315f2 not found: ID does not exist" containerID="46ae1db0289a390a580adec191ede414d498463e3d3c7e4c32f82c90099315f2" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.256524 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46ae1db0289a390a580adec191ede414d498463e3d3c7e4c32f82c90099315f2"} err="failed to get container status \"46ae1db0289a390a580adec191ede414d498463e3d3c7e4c32f82c90099315f2\": rpc error: code = NotFound desc = could not find container \"46ae1db0289a390a580adec191ede414d498463e3d3c7e4c32f82c90099315f2\": container with ID 
starting with 46ae1db0289a390a580adec191ede414d498463e3d3c7e4c32f82c90099315f2 not found: ID does not exist" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.256645 4753 scope.go:117] "RemoveContainer" containerID="2f73abaacb913ab77c9be85b16c880e6d88e88f1a7143048f637db53ba9eb8b9" Oct 05 21:21:44 crc kubenswrapper[4753]: E1005 21:21:44.257258 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f73abaacb913ab77c9be85b16c880e6d88e88f1a7143048f637db53ba9eb8b9\": container with ID starting with 2f73abaacb913ab77c9be85b16c880e6d88e88f1a7143048f637db53ba9eb8b9 not found: ID does not exist" containerID="2f73abaacb913ab77c9be85b16c880e6d88e88f1a7143048f637db53ba9eb8b9" Oct 05 21:21:44 crc kubenswrapper[4753]: I1005 21:21:44.257291 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f73abaacb913ab77c9be85b16c880e6d88e88f1a7143048f637db53ba9eb8b9"} err="failed to get container status \"2f73abaacb913ab77c9be85b16c880e6d88e88f1a7143048f637db53ba9eb8b9\": rpc error: code = NotFound desc = could not find container \"2f73abaacb913ab77c9be85b16c880e6d88e88f1a7143048f637db53ba9eb8b9\": container with ID starting with 2f73abaacb913ab77c9be85b16c880e6d88e88f1a7143048f637db53ba9eb8b9 not found: ID does not exist" Oct 05 21:21:45 crc kubenswrapper[4753]: I1005 21:21:45.604320 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v7fpp"] Oct 05 21:21:45 crc kubenswrapper[4753]: I1005 21:21:45.605547 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v7fpp" podUID="2a9c7cba-7b98-42f1-b436-334fa4f06574" containerName="registry-server" containerID="cri-o://3850a0a21da018d017682f11a467bbdac2ae6ccdd20cb80563a7dfa0a63ad791" gracePeriod=2 Oct 05 21:21:45 crc kubenswrapper[4753]: I1005 21:21:45.863050 4753 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="a311c60b-e9c5-4ba5-a913-d1ef5db7ff47" path="/var/lib/kubelet/pods/a311c60b-e9c5-4ba5-a913-d1ef5db7ff47/volumes" Oct 05 21:21:46 crc kubenswrapper[4753]: I1005 21:21:46.164372 4753 generic.go:334] "Generic (PLEG): container finished" podID="2a9c7cba-7b98-42f1-b436-334fa4f06574" containerID="3850a0a21da018d017682f11a467bbdac2ae6ccdd20cb80563a7dfa0a63ad791" exitCode=0 Oct 05 21:21:46 crc kubenswrapper[4753]: I1005 21:21:46.164499 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7fpp" event={"ID":"2a9c7cba-7b98-42f1-b436-334fa4f06574","Type":"ContainerDied","Data":"3850a0a21da018d017682f11a467bbdac2ae6ccdd20cb80563a7dfa0a63ad791"} Oct 05 21:21:46 crc kubenswrapper[4753]: I1005 21:21:46.164643 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v7fpp" event={"ID":"2a9c7cba-7b98-42f1-b436-334fa4f06574","Type":"ContainerDied","Data":"6750cdd1580e210cd66d2564adeb8fc8321afeb6e2f1bfef387cb864bb9a4652"} Oct 05 21:21:46 crc kubenswrapper[4753]: I1005 21:21:46.164654 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6750cdd1580e210cd66d2564adeb8fc8321afeb6e2f1bfef387cb864bb9a4652" Oct 05 21:21:46 crc kubenswrapper[4753]: I1005 21:21:46.188425 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:46 crc kubenswrapper[4753]: I1005 21:21:46.349475 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a9c7cba-7b98-42f1-b436-334fa4f06574-catalog-content\") pod \"2a9c7cba-7b98-42f1-b436-334fa4f06574\" (UID: \"2a9c7cba-7b98-42f1-b436-334fa4f06574\") " Oct 05 21:21:46 crc kubenswrapper[4753]: I1005 21:21:46.349539 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsb7p\" (UniqueName: \"kubernetes.io/projected/2a9c7cba-7b98-42f1-b436-334fa4f06574-kube-api-access-xsb7p\") pod \"2a9c7cba-7b98-42f1-b436-334fa4f06574\" (UID: \"2a9c7cba-7b98-42f1-b436-334fa4f06574\") " Oct 05 21:21:46 crc kubenswrapper[4753]: I1005 21:21:46.349625 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a9c7cba-7b98-42f1-b436-334fa4f06574-utilities\") pod \"2a9c7cba-7b98-42f1-b436-334fa4f06574\" (UID: \"2a9c7cba-7b98-42f1-b436-334fa4f06574\") " Oct 05 21:21:46 crc kubenswrapper[4753]: I1005 21:21:46.350576 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a9c7cba-7b98-42f1-b436-334fa4f06574-utilities" (OuterVolumeSpecName: "utilities") pod "2a9c7cba-7b98-42f1-b436-334fa4f06574" (UID: "2a9c7cba-7b98-42f1-b436-334fa4f06574"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:21:46 crc kubenswrapper[4753]: I1005 21:21:46.360357 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a9c7cba-7b98-42f1-b436-334fa4f06574-kube-api-access-xsb7p" (OuterVolumeSpecName: "kube-api-access-xsb7p") pod "2a9c7cba-7b98-42f1-b436-334fa4f06574" (UID: "2a9c7cba-7b98-42f1-b436-334fa4f06574"). InnerVolumeSpecName "kube-api-access-xsb7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:21:46 crc kubenswrapper[4753]: I1005 21:21:46.364292 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a9c7cba-7b98-42f1-b436-334fa4f06574-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a9c7cba-7b98-42f1-b436-334fa4f06574" (UID: "2a9c7cba-7b98-42f1-b436-334fa4f06574"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:21:46 crc kubenswrapper[4753]: I1005 21:21:46.451849 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a9c7cba-7b98-42f1-b436-334fa4f06574-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 21:21:46 crc kubenswrapper[4753]: I1005 21:21:46.451883 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsb7p\" (UniqueName: \"kubernetes.io/projected/2a9c7cba-7b98-42f1-b436-334fa4f06574-kube-api-access-xsb7p\") on node \"crc\" DevicePath \"\"" Oct 05 21:21:46 crc kubenswrapper[4753]: I1005 21:21:46.451892 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a9c7cba-7b98-42f1-b436-334fa4f06574-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 21:21:47 crc kubenswrapper[4753]: I1005 21:21:47.172536 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v7fpp" Oct 05 21:21:47 crc kubenswrapper[4753]: I1005 21:21:47.205954 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v7fpp"] Oct 05 21:21:47 crc kubenswrapper[4753]: I1005 21:21:47.213830 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v7fpp"] Oct 05 21:21:47 crc kubenswrapper[4753]: I1005 21:21:47.863585 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a9c7cba-7b98-42f1-b436-334fa4f06574" path="/var/lib/kubelet/pods/2a9c7cba-7b98-42f1-b436-334fa4f06574/volumes" Oct 05 21:21:53 crc kubenswrapper[4753]: I1005 21:21:53.609446 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k4gxz" podUID="914b9197-febb-4aa8-a221-53b0009d2a7b" containerName="registry-server" probeResult="failure" output=< Oct 05 21:21:53 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 21:21:53 crc kubenswrapper[4753]: > Oct 05 21:22:02 crc kubenswrapper[4753]: I1005 21:22:02.611813 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:22:02 crc kubenswrapper[4753]: I1005 21:22:02.662439 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:22:02 crc kubenswrapper[4753]: I1005 21:22:02.847612 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k4gxz"] Oct 05 21:22:04 crc kubenswrapper[4753]: I1005 21:22:04.324537 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k4gxz" podUID="914b9197-febb-4aa8-a221-53b0009d2a7b" containerName="registry-server" containerID="cri-o://c0f85eef4a3af73cbed89d12b8cbd1b873ff3ae93bdabe51f1f3b963c31c35bd" 
gracePeriod=2 Oct 05 21:22:04 crc kubenswrapper[4753]: I1005 21:22:04.810835 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:22:04 crc kubenswrapper[4753]: I1005 21:22:04.872164 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/914b9197-febb-4aa8-a221-53b0009d2a7b-catalog-content\") pod \"914b9197-febb-4aa8-a221-53b0009d2a7b\" (UID: \"914b9197-febb-4aa8-a221-53b0009d2a7b\") " Oct 05 21:22:04 crc kubenswrapper[4753]: I1005 21:22:04.872349 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/914b9197-febb-4aa8-a221-53b0009d2a7b-utilities\") pod \"914b9197-febb-4aa8-a221-53b0009d2a7b\" (UID: \"914b9197-febb-4aa8-a221-53b0009d2a7b\") " Oct 05 21:22:04 crc kubenswrapper[4753]: I1005 21:22:04.872390 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frscv\" (UniqueName: \"kubernetes.io/projected/914b9197-febb-4aa8-a221-53b0009d2a7b-kube-api-access-frscv\") pod \"914b9197-febb-4aa8-a221-53b0009d2a7b\" (UID: \"914b9197-febb-4aa8-a221-53b0009d2a7b\") " Oct 05 21:22:04 crc kubenswrapper[4753]: I1005 21:22:04.873276 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/914b9197-febb-4aa8-a221-53b0009d2a7b-utilities" (OuterVolumeSpecName: "utilities") pod "914b9197-febb-4aa8-a221-53b0009d2a7b" (UID: "914b9197-febb-4aa8-a221-53b0009d2a7b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:22:04 crc kubenswrapper[4753]: I1005 21:22:04.879367 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/914b9197-febb-4aa8-a221-53b0009d2a7b-kube-api-access-frscv" (OuterVolumeSpecName: "kube-api-access-frscv") pod "914b9197-febb-4aa8-a221-53b0009d2a7b" (UID: "914b9197-febb-4aa8-a221-53b0009d2a7b"). InnerVolumeSpecName "kube-api-access-frscv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:22:04 crc kubenswrapper[4753]: I1005 21:22:04.956538 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/914b9197-febb-4aa8-a221-53b0009d2a7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "914b9197-febb-4aa8-a221-53b0009d2a7b" (UID: "914b9197-febb-4aa8-a221-53b0009d2a7b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:22:04 crc kubenswrapper[4753]: I1005 21:22:04.974662 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/914b9197-febb-4aa8-a221-53b0009d2a7b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 21:22:04 crc kubenswrapper[4753]: I1005 21:22:04.974693 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/914b9197-febb-4aa8-a221-53b0009d2a7b-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 21:22:04 crc kubenswrapper[4753]: I1005 21:22:04.974703 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frscv\" (UniqueName: \"kubernetes.io/projected/914b9197-febb-4aa8-a221-53b0009d2a7b-kube-api-access-frscv\") on node \"crc\" DevicePath \"\"" Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.338898 4753 generic.go:334] "Generic (PLEG): container finished" podID="914b9197-febb-4aa8-a221-53b0009d2a7b" 
containerID="c0f85eef4a3af73cbed89d12b8cbd1b873ff3ae93bdabe51f1f3b963c31c35bd" exitCode=0 Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.338945 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gxz" event={"ID":"914b9197-febb-4aa8-a221-53b0009d2a7b","Type":"ContainerDied","Data":"c0f85eef4a3af73cbed89d12b8cbd1b873ff3ae93bdabe51f1f3b963c31c35bd"} Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.338977 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gxz" event={"ID":"914b9197-febb-4aa8-a221-53b0009d2a7b","Type":"ContainerDied","Data":"0739dc45eec584f7cdd370bb0cd5f6d56b719c2db0ef0669e0a24304939cc6f6"} Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.338998 4753 scope.go:117] "RemoveContainer" containerID="c0f85eef4a3af73cbed89d12b8cbd1b873ff3ae93bdabe51f1f3b963c31c35bd" Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.339075 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k4gxz" Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.380372 4753 scope.go:117] "RemoveContainer" containerID="70a6752db98c853152f14be12f225adb5d7de28100e5a5f5c7cfc45ddc5e5039" Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.386184 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k4gxz"] Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.394816 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k4gxz"] Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.412155 4753 scope.go:117] "RemoveContainer" containerID="2a003fc2d292754745262fa237d04def24126a1a2aec93f5075856e51033bfa8" Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.469085 4753 scope.go:117] "RemoveContainer" containerID="c0f85eef4a3af73cbed89d12b8cbd1b873ff3ae93bdabe51f1f3b963c31c35bd" Oct 05 21:22:05 crc kubenswrapper[4753]: E1005 21:22:05.472797 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f85eef4a3af73cbed89d12b8cbd1b873ff3ae93bdabe51f1f3b963c31c35bd\": container with ID starting with c0f85eef4a3af73cbed89d12b8cbd1b873ff3ae93bdabe51f1f3b963c31c35bd not found: ID does not exist" containerID="c0f85eef4a3af73cbed89d12b8cbd1b873ff3ae93bdabe51f1f3b963c31c35bd" Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.472834 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f85eef4a3af73cbed89d12b8cbd1b873ff3ae93bdabe51f1f3b963c31c35bd"} err="failed to get container status \"c0f85eef4a3af73cbed89d12b8cbd1b873ff3ae93bdabe51f1f3b963c31c35bd\": rpc error: code = NotFound desc = could not find container \"c0f85eef4a3af73cbed89d12b8cbd1b873ff3ae93bdabe51f1f3b963c31c35bd\": container with ID starting with c0f85eef4a3af73cbed89d12b8cbd1b873ff3ae93bdabe51f1f3b963c31c35bd not found: ID does 
not exist" Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.472857 4753 scope.go:117] "RemoveContainer" containerID="70a6752db98c853152f14be12f225adb5d7de28100e5a5f5c7cfc45ddc5e5039" Oct 05 21:22:05 crc kubenswrapper[4753]: E1005 21:22:05.476408 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70a6752db98c853152f14be12f225adb5d7de28100e5a5f5c7cfc45ddc5e5039\": container with ID starting with 70a6752db98c853152f14be12f225adb5d7de28100e5a5f5c7cfc45ddc5e5039 not found: ID does not exist" containerID="70a6752db98c853152f14be12f225adb5d7de28100e5a5f5c7cfc45ddc5e5039" Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.476442 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a6752db98c853152f14be12f225adb5d7de28100e5a5f5c7cfc45ddc5e5039"} err="failed to get container status \"70a6752db98c853152f14be12f225adb5d7de28100e5a5f5c7cfc45ddc5e5039\": rpc error: code = NotFound desc = could not find container \"70a6752db98c853152f14be12f225adb5d7de28100e5a5f5c7cfc45ddc5e5039\": container with ID starting with 70a6752db98c853152f14be12f225adb5d7de28100e5a5f5c7cfc45ddc5e5039 not found: ID does not exist" Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.476463 4753 scope.go:117] "RemoveContainer" containerID="2a003fc2d292754745262fa237d04def24126a1a2aec93f5075856e51033bfa8" Oct 05 21:22:05 crc kubenswrapper[4753]: E1005 21:22:05.483588 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a003fc2d292754745262fa237d04def24126a1a2aec93f5075856e51033bfa8\": container with ID starting with 2a003fc2d292754745262fa237d04def24126a1a2aec93f5075856e51033bfa8 not found: ID does not exist" containerID="2a003fc2d292754745262fa237d04def24126a1a2aec93f5075856e51033bfa8" Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.483642 4753 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a003fc2d292754745262fa237d04def24126a1a2aec93f5075856e51033bfa8"} err="failed to get container status \"2a003fc2d292754745262fa237d04def24126a1a2aec93f5075856e51033bfa8\": rpc error: code = NotFound desc = could not find container \"2a003fc2d292754745262fa237d04def24126a1a2aec93f5075856e51033bfa8\": container with ID starting with 2a003fc2d292754745262fa237d04def24126a1a2aec93f5075856e51033bfa8 not found: ID does not exist" Oct 05 21:22:05 crc kubenswrapper[4753]: I1005 21:22:05.864611 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="914b9197-febb-4aa8-a221-53b0009d2a7b" path="/var/lib/kubelet/pods/914b9197-febb-4aa8-a221-53b0009d2a7b/volumes" Oct 05 21:23:34 crc kubenswrapper[4753]: I1005 21:23:34.490176 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:23:34 crc kubenswrapper[4753]: I1005 21:23:34.491009 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:24:04 crc kubenswrapper[4753]: I1005 21:24:04.489795 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:24:04 crc kubenswrapper[4753]: I1005 21:24:04.490453 4753 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:24:34 crc kubenswrapper[4753]: I1005 21:24:34.490627 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:24:34 crc kubenswrapper[4753]: I1005 21:24:34.491044 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:24:34 crc kubenswrapper[4753]: I1005 21:24:34.491085 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 21:24:34 crc kubenswrapper[4753]: I1005 21:24:34.491793 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf5b3dbe67258318b8270a000eb7f71dacdc8fd82ceca6b74fc564bc02d85d55"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 21:24:34 crc kubenswrapper[4753]: I1005 21:24:34.491842 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" 
containerID="cri-o://cf5b3dbe67258318b8270a000eb7f71dacdc8fd82ceca6b74fc564bc02d85d55" gracePeriod=600 Oct 05 21:24:34 crc kubenswrapper[4753]: I1005 21:24:34.680450 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="cf5b3dbe67258318b8270a000eb7f71dacdc8fd82ceca6b74fc564bc02d85d55" exitCode=0 Oct 05 21:24:34 crc kubenswrapper[4753]: I1005 21:24:34.680770 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"cf5b3dbe67258318b8270a000eb7f71dacdc8fd82ceca6b74fc564bc02d85d55"} Oct 05 21:24:34 crc kubenswrapper[4753]: I1005 21:24:34.680801 4753 scope.go:117] "RemoveContainer" containerID="07fd213072d6a3c97abb777f57f1baee2c6c0512c30db717a5ef0dbd15683fe8" Oct 05 21:24:35 crc kubenswrapper[4753]: I1005 21:24:35.690244 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691"} Oct 05 21:25:21 crc kubenswrapper[4753]: I1005 21:25:21.134352 4753 generic.go:334] "Generic (PLEG): container finished" podID="989178f4-ef23-49c1-88f8-10babb448a68" containerID="7f2b15978fdaced233b753f9a843ccd14297dacb0ae85b887d500fd5c08dd619" exitCode=0 Oct 05 21:25:21 crc kubenswrapper[4753]: I1005 21:25:21.135790 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"989178f4-ef23-49c1-88f8-10babb448a68","Type":"ContainerDied","Data":"7f2b15978fdaced233b753f9a843ccd14297dacb0ae85b887d500fd5c08dd619"} Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.526259 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.640396 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbh6w\" (UniqueName: \"kubernetes.io/projected/989178f4-ef23-49c1-88f8-10babb448a68-kube-api-access-rbh6w\") pod \"989178f4-ef23-49c1-88f8-10babb448a68\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.640652 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/989178f4-ef23-49c1-88f8-10babb448a68-test-operator-ephemeral-temporary\") pod \"989178f4-ef23-49c1-88f8-10babb448a68\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.640791 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/989178f4-ef23-49c1-88f8-10babb448a68-openstack-config\") pod \"989178f4-ef23-49c1-88f8-10babb448a68\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.640893 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/989178f4-ef23-49c1-88f8-10babb448a68-config-data\") pod \"989178f4-ef23-49c1-88f8-10babb448a68\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.641063 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-openstack-config-secret\") pod \"989178f4-ef23-49c1-88f8-10babb448a68\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.641222 4753 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/989178f4-ef23-49c1-88f8-10babb448a68-test-operator-ephemeral-workdir\") pod \"989178f4-ef23-49c1-88f8-10babb448a68\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.641351 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-ca-certs\") pod \"989178f4-ef23-49c1-88f8-10babb448a68\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.641454 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"989178f4-ef23-49c1-88f8-10babb448a68\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.641576 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-ssh-key\") pod \"989178f4-ef23-49c1-88f8-10babb448a68\" (UID: \"989178f4-ef23-49c1-88f8-10babb448a68\") " Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.642114 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/989178f4-ef23-49c1-88f8-10babb448a68-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "989178f4-ef23-49c1-88f8-10babb448a68" (UID: "989178f4-ef23-49c1-88f8-10babb448a68"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.642303 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/989178f4-ef23-49c1-88f8-10babb448a68-config-data" (OuterVolumeSpecName: "config-data") pod "989178f4-ef23-49c1-88f8-10babb448a68" (UID: "989178f4-ef23-49c1-88f8-10babb448a68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.645900 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/989178f4-ef23-49c1-88f8-10babb448a68-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "989178f4-ef23-49c1-88f8-10babb448a68" (UID: "989178f4-ef23-49c1-88f8-10babb448a68"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.663168 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "989178f4-ef23-49c1-88f8-10babb448a68" (UID: "989178f4-ef23-49c1-88f8-10babb448a68"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.663273 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/989178f4-ef23-49c1-88f8-10babb448a68-kube-api-access-rbh6w" (OuterVolumeSpecName: "kube-api-access-rbh6w") pod "989178f4-ef23-49c1-88f8-10babb448a68" (UID: "989178f4-ef23-49c1-88f8-10babb448a68"). InnerVolumeSpecName "kube-api-access-rbh6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.670047 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "989178f4-ef23-49c1-88f8-10babb448a68" (UID: "989178f4-ef23-49c1-88f8-10babb448a68"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.676011 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "989178f4-ef23-49c1-88f8-10babb448a68" (UID: "989178f4-ef23-49c1-88f8-10babb448a68"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.677156 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "989178f4-ef23-49c1-88f8-10babb448a68" (UID: "989178f4-ef23-49c1-88f8-10babb448a68"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.705533 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/989178f4-ef23-49c1-88f8-10babb448a68-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "989178f4-ef23-49c1-88f8-10babb448a68" (UID: "989178f4-ef23-49c1-88f8-10babb448a68"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.743374 4753 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/989178f4-ef23-49c1-88f8-10babb448a68-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.743407 4753 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.744321 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.744340 4753 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.744350 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbh6w\" (UniqueName: \"kubernetes.io/projected/989178f4-ef23-49c1-88f8-10babb448a68-kube-api-access-rbh6w\") on node \"crc\" DevicePath \"\"" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.744360 4753 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/989178f4-ef23-49c1-88f8-10babb448a68-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.744372 4753 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/989178f4-ef23-49c1-88f8-10babb448a68-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 05 21:25:22 crc 
kubenswrapper[4753]: I1005 21:25:22.744382 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/989178f4-ef23-49c1-88f8-10babb448a68-config-data\") on node \"crc\" DevicePath \"\"" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.744391 4753 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/989178f4-ef23-49c1-88f8-10babb448a68-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.765681 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 05 21:25:22 crc kubenswrapper[4753]: I1005 21:25:22.845866 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 05 21:25:23 crc kubenswrapper[4753]: I1005 21:25:23.153037 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"989178f4-ef23-49c1-88f8-10babb448a68","Type":"ContainerDied","Data":"bad8d157c8a6f1cf6cc0f6dc7107c1624dabbf2a7aad7e327280600e469c6794"} Oct 05 21:25:23 crc kubenswrapper[4753]: I1005 21:25:23.153356 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bad8d157c8a6f1cf6cc0f6dc7107c1624dabbf2a7aad7e327280600e469c6794" Oct 05 21:25:23 crc kubenswrapper[4753]: I1005 21:25:23.153232 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.155462 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 05 21:25:32 crc kubenswrapper[4753]: E1005 21:25:32.158727 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a311c60b-e9c5-4ba5-a913-d1ef5db7ff47" containerName="extract-utilities" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.158941 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a311c60b-e9c5-4ba5-a913-d1ef5db7ff47" containerName="extract-utilities" Oct 05 21:25:32 crc kubenswrapper[4753]: E1005 21:25:32.159031 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="914b9197-febb-4aa8-a221-53b0009d2a7b" containerName="registry-server" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.159107 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="914b9197-febb-4aa8-a221-53b0009d2a7b" containerName="registry-server" Oct 05 21:25:32 crc kubenswrapper[4753]: E1005 21:25:32.159222 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a311c60b-e9c5-4ba5-a913-d1ef5db7ff47" containerName="registry-server" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.159312 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a311c60b-e9c5-4ba5-a913-d1ef5db7ff47" containerName="registry-server" Oct 05 21:25:32 crc kubenswrapper[4753]: E1005 21:25:32.159731 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="914b9197-febb-4aa8-a221-53b0009d2a7b" containerName="extract-utilities" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.159818 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="914b9197-febb-4aa8-a221-53b0009d2a7b" containerName="extract-utilities" Oct 05 21:25:32 crc kubenswrapper[4753]: E1005 21:25:32.159900 4753 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2a9c7cba-7b98-42f1-b436-334fa4f06574" containerName="registry-server" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.159973 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a9c7cba-7b98-42f1-b436-334fa4f06574" containerName="registry-server" Oct 05 21:25:32 crc kubenswrapper[4753]: E1005 21:25:32.160069 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a311c60b-e9c5-4ba5-a913-d1ef5db7ff47" containerName="extract-content" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.160163 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a311c60b-e9c5-4ba5-a913-d1ef5db7ff47" containerName="extract-content" Oct 05 21:25:32 crc kubenswrapper[4753]: E1005 21:25:32.160260 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989178f4-ef23-49c1-88f8-10babb448a68" containerName="tempest-tests-tempest-tests-runner" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.160347 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="989178f4-ef23-49c1-88f8-10babb448a68" containerName="tempest-tests-tempest-tests-runner" Oct 05 21:25:32 crc kubenswrapper[4753]: E1005 21:25:32.160475 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a9c7cba-7b98-42f1-b436-334fa4f06574" containerName="extract-utilities" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.160563 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a9c7cba-7b98-42f1-b436-334fa4f06574" containerName="extract-utilities" Oct 05 21:25:32 crc kubenswrapper[4753]: E1005 21:25:32.160647 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a9c7cba-7b98-42f1-b436-334fa4f06574" containerName="extract-content" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.160724 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a9c7cba-7b98-42f1-b436-334fa4f06574" containerName="extract-content" Oct 05 21:25:32 crc kubenswrapper[4753]: E1005 21:25:32.160829 4753 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="914b9197-febb-4aa8-a221-53b0009d2a7b" containerName="extract-content" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.160919 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="914b9197-febb-4aa8-a221-53b0009d2a7b" containerName="extract-content" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.161269 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a9c7cba-7b98-42f1-b436-334fa4f06574" containerName="registry-server" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.161383 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a311c60b-e9c5-4ba5-a913-d1ef5db7ff47" containerName="registry-server" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.161467 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="989178f4-ef23-49c1-88f8-10babb448a68" containerName="tempest-tests-tempest-tests-runner" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.161564 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="914b9197-febb-4aa8-a221-53b0009d2a7b" containerName="registry-server" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.162512 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.166524 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-9mbwd" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.179080 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.359087 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cf30bb5c-e675-411f-b50d-77dff81c83af\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.359195 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tc59\" (UniqueName: \"kubernetes.io/projected/cf30bb5c-e675-411f-b50d-77dff81c83af-kube-api-access-7tc59\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cf30bb5c-e675-411f-b50d-77dff81c83af\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.462275 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cf30bb5c-e675-411f-b50d-77dff81c83af\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.462359 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tc59\" (UniqueName: 
\"kubernetes.io/projected/cf30bb5c-e675-411f-b50d-77dff81c83af-kube-api-access-7tc59\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cf30bb5c-e675-411f-b50d-77dff81c83af\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.462920 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cf30bb5c-e675-411f-b50d-77dff81c83af\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.499819 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cf30bb5c-e675-411f-b50d-77dff81c83af\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.501731 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tc59\" (UniqueName: \"kubernetes.io/projected/cf30bb5c-e675-411f-b50d-77dff81c83af-kube-api-access-7tc59\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"cf30bb5c-e675-411f-b50d-77dff81c83af\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 05 21:25:32 crc kubenswrapper[4753]: I1005 21:25:32.796627 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 05 21:25:33 crc kubenswrapper[4753]: I1005 21:25:33.301474 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 05 21:25:34 crc kubenswrapper[4753]: I1005 21:25:34.283981 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"cf30bb5c-e675-411f-b50d-77dff81c83af","Type":"ContainerStarted","Data":"87dadc6fd751c8583f19cdeaf6a9dfe8278e2b71c18da8da939507869a0dceee"} Oct 05 21:25:35 crc kubenswrapper[4753]: I1005 21:25:35.297029 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"cf30bb5c-e675-411f-b50d-77dff81c83af","Type":"ContainerStarted","Data":"aec9bbd5a4782248ae96eed77420adb50d1ca5596b99d99a712b5a8f42c46921"} Oct 05 21:25:35 crc kubenswrapper[4753]: I1005 21:25:35.325077 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.199722423 podStartE2EDuration="3.325054146s" podCreationTimestamp="2025-10-05 21:25:32 +0000 UTC" firstStartedPulling="2025-10-05 21:25:33.326702113 +0000 UTC m=+4242.175030345" lastFinishedPulling="2025-10-05 21:25:34.452033836 +0000 UTC m=+4243.300362068" observedRunningTime="2025-10-05 21:25:35.313303056 +0000 UTC m=+4244.161631288" watchObservedRunningTime="2025-10-05 21:25:35.325054146 +0000 UTC m=+4244.173382388" Oct 05 21:25:52 crc kubenswrapper[4753]: I1005 21:25:52.692198 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-txk7h/must-gather-w7hs5"] Oct 05 21:25:52 crc kubenswrapper[4753]: I1005 21:25:52.694338 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-txk7h/must-gather-w7hs5" Oct 05 21:25:52 crc kubenswrapper[4753]: I1005 21:25:52.696634 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-txk7h"/"default-dockercfg-l8l2s" Oct 05 21:25:52 crc kubenswrapper[4753]: I1005 21:25:52.696718 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-txk7h"/"openshift-service-ca.crt" Oct 05 21:25:52 crc kubenswrapper[4753]: I1005 21:25:52.696963 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-txk7h"/"kube-root-ca.crt" Oct 05 21:25:52 crc kubenswrapper[4753]: I1005 21:25:52.700461 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-txk7h/must-gather-w7hs5"] Oct 05 21:25:52 crc kubenswrapper[4753]: I1005 21:25:52.837011 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9qn5\" (UniqueName: \"kubernetes.io/projected/a8f3008a-d05c-4bff-8cab-025b62c2c216-kube-api-access-x9qn5\") pod \"must-gather-w7hs5\" (UID: \"a8f3008a-d05c-4bff-8cab-025b62c2c216\") " pod="openshift-must-gather-txk7h/must-gather-w7hs5" Oct 05 21:25:52 crc kubenswrapper[4753]: I1005 21:25:52.837172 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a8f3008a-d05c-4bff-8cab-025b62c2c216-must-gather-output\") pod \"must-gather-w7hs5\" (UID: \"a8f3008a-d05c-4bff-8cab-025b62c2c216\") " pod="openshift-must-gather-txk7h/must-gather-w7hs5" Oct 05 21:25:52 crc kubenswrapper[4753]: I1005 21:25:52.938753 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9qn5\" (UniqueName: \"kubernetes.io/projected/a8f3008a-d05c-4bff-8cab-025b62c2c216-kube-api-access-x9qn5\") pod \"must-gather-w7hs5\" (UID: \"a8f3008a-d05c-4bff-8cab-025b62c2c216\") " 
pod="openshift-must-gather-txk7h/must-gather-w7hs5" Oct 05 21:25:52 crc kubenswrapper[4753]: I1005 21:25:52.938951 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a8f3008a-d05c-4bff-8cab-025b62c2c216-must-gather-output\") pod \"must-gather-w7hs5\" (UID: \"a8f3008a-d05c-4bff-8cab-025b62c2c216\") " pod="openshift-must-gather-txk7h/must-gather-w7hs5" Oct 05 21:25:52 crc kubenswrapper[4753]: I1005 21:25:52.940803 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a8f3008a-d05c-4bff-8cab-025b62c2c216-must-gather-output\") pod \"must-gather-w7hs5\" (UID: \"a8f3008a-d05c-4bff-8cab-025b62c2c216\") " pod="openshift-must-gather-txk7h/must-gather-w7hs5" Oct 05 21:25:52 crc kubenswrapper[4753]: I1005 21:25:52.960038 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9qn5\" (UniqueName: \"kubernetes.io/projected/a8f3008a-d05c-4bff-8cab-025b62c2c216-kube-api-access-x9qn5\") pod \"must-gather-w7hs5\" (UID: \"a8f3008a-d05c-4bff-8cab-025b62c2c216\") " pod="openshift-must-gather-txk7h/must-gather-w7hs5" Oct 05 21:25:53 crc kubenswrapper[4753]: I1005 21:25:53.016652 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-txk7h/must-gather-w7hs5" Oct 05 21:25:53 crc kubenswrapper[4753]: I1005 21:25:53.366713 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-txk7h/must-gather-w7hs5"] Oct 05 21:25:53 crc kubenswrapper[4753]: I1005 21:25:53.504963 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-txk7h/must-gather-w7hs5" event={"ID":"a8f3008a-d05c-4bff-8cab-025b62c2c216","Type":"ContainerStarted","Data":"814c4761a018d10da7815fe80926d052f3bc57738b73dfc25bdaa4263548308d"} Oct 05 21:25:59 crc kubenswrapper[4753]: I1005 21:25:59.559297 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-txk7h/must-gather-w7hs5" event={"ID":"a8f3008a-d05c-4bff-8cab-025b62c2c216","Type":"ContainerStarted","Data":"619841d507eb52f86dc132c695c90b11e4907330a7d59d58a386ec38adb934aa"} Oct 05 21:25:59 crc kubenswrapper[4753]: I1005 21:25:59.559856 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-txk7h/must-gather-w7hs5" event={"ID":"a8f3008a-d05c-4bff-8cab-025b62c2c216","Type":"ContainerStarted","Data":"8a071fed57c3098f869951e7c2b9303d68b8935b3f98ff6e40d0aa3a719fedb2"} Oct 05 21:25:59 crc kubenswrapper[4753]: I1005 21:25:59.577056 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-txk7h/must-gather-w7hs5" podStartSLOduration=2.675944853 podStartE2EDuration="7.577033552s" podCreationTimestamp="2025-10-05 21:25:52 +0000 UTC" firstStartedPulling="2025-10-05 21:25:53.358500507 +0000 UTC m=+4262.206828739" lastFinishedPulling="2025-10-05 21:25:58.259589206 +0000 UTC m=+4267.107917438" observedRunningTime="2025-10-05 21:25:59.571926954 +0000 UTC m=+4268.420255186" watchObservedRunningTime="2025-10-05 21:25:59.577033552 +0000 UTC m=+4268.425361794" Oct 05 21:26:04 crc kubenswrapper[4753]: I1005 21:26:04.174648 4753 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-txk7h/crc-debug-4dsc6"] Oct 05 21:26:04 crc kubenswrapper[4753]: I1005 21:26:04.178864 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-txk7h/crc-debug-4dsc6" Oct 05 21:26:04 crc kubenswrapper[4753]: I1005 21:26:04.259095 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7grsf\" (UniqueName: \"kubernetes.io/projected/18f82263-d631-4623-a434-b940a6cb65c0-kube-api-access-7grsf\") pod \"crc-debug-4dsc6\" (UID: \"18f82263-d631-4623-a434-b940a6cb65c0\") " pod="openshift-must-gather-txk7h/crc-debug-4dsc6" Oct 05 21:26:04 crc kubenswrapper[4753]: I1005 21:26:04.259283 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18f82263-d631-4623-a434-b940a6cb65c0-host\") pod \"crc-debug-4dsc6\" (UID: \"18f82263-d631-4623-a434-b940a6cb65c0\") " pod="openshift-must-gather-txk7h/crc-debug-4dsc6" Oct 05 21:26:04 crc kubenswrapper[4753]: I1005 21:26:04.360922 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7grsf\" (UniqueName: \"kubernetes.io/projected/18f82263-d631-4623-a434-b940a6cb65c0-kube-api-access-7grsf\") pod \"crc-debug-4dsc6\" (UID: \"18f82263-d631-4623-a434-b940a6cb65c0\") " pod="openshift-must-gather-txk7h/crc-debug-4dsc6" Oct 05 21:26:04 crc kubenswrapper[4753]: I1005 21:26:04.360989 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18f82263-d631-4623-a434-b940a6cb65c0-host\") pod \"crc-debug-4dsc6\" (UID: \"18f82263-d631-4623-a434-b940a6cb65c0\") " pod="openshift-must-gather-txk7h/crc-debug-4dsc6" Oct 05 21:26:04 crc kubenswrapper[4753]: I1005 21:26:04.361160 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/18f82263-d631-4623-a434-b940a6cb65c0-host\") pod \"crc-debug-4dsc6\" (UID: \"18f82263-d631-4623-a434-b940a6cb65c0\") " pod="openshift-must-gather-txk7h/crc-debug-4dsc6" Oct 05 21:26:04 crc kubenswrapper[4753]: I1005 21:26:04.382216 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7grsf\" (UniqueName: \"kubernetes.io/projected/18f82263-d631-4623-a434-b940a6cb65c0-kube-api-access-7grsf\") pod \"crc-debug-4dsc6\" (UID: \"18f82263-d631-4623-a434-b940a6cb65c0\") " pod="openshift-must-gather-txk7h/crc-debug-4dsc6" Oct 05 21:26:04 crc kubenswrapper[4753]: I1005 21:26:04.499994 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-txk7h/crc-debug-4dsc6" Oct 05 21:26:04 crc kubenswrapper[4753]: I1005 21:26:04.543412 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 05 21:26:04 crc kubenswrapper[4753]: I1005 21:26:04.622409 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-txk7h/crc-debug-4dsc6" event={"ID":"18f82263-d631-4623-a434-b940a6cb65c0","Type":"ContainerStarted","Data":"e3093a0bad3a92a07d132516f04a8e8c64d313b0b86840ade7878797ed87179c"} Oct 05 21:26:14 crc kubenswrapper[4753]: I1005 21:26:14.715449 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-txk7h/crc-debug-4dsc6" event={"ID":"18f82263-d631-4623-a434-b940a6cb65c0","Type":"ContainerStarted","Data":"cce6d019711360b0e1303c6875bbb315880b2d1b7fb0b6d6fcf150422bb10631"} Oct 05 21:26:14 crc kubenswrapper[4753]: I1005 21:26:14.737637 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-txk7h/crc-debug-4dsc6" podStartSLOduration=0.974675714 podStartE2EDuration="10.737618661s" podCreationTimestamp="2025-10-05 21:26:04 +0000 UTC" firstStartedPulling="2025-10-05 21:26:04.543229669 +0000 UTC m=+4273.391557901" lastFinishedPulling="2025-10-05 
21:26:14.306172616 +0000 UTC m=+4283.154500848" observedRunningTime="2025-10-05 21:26:14.736688273 +0000 UTC m=+4283.585016505" watchObservedRunningTime="2025-10-05 21:26:14.737618661 +0000 UTC m=+4283.585946893" Oct 05 21:26:34 crc kubenswrapper[4753]: I1005 21:26:34.489697 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:26:34 crc kubenswrapper[4753]: I1005 21:26:34.490343 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:27:04 crc kubenswrapper[4753]: I1005 21:27:04.490039 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:27:04 crc kubenswrapper[4753]: I1005 21:27:04.490519 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:27:34 crc kubenswrapper[4753]: I1005 21:27:34.490517 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:27:34 crc kubenswrapper[4753]: I1005 21:27:34.491295 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:27:34 crc kubenswrapper[4753]: I1005 21:27:34.491369 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 21:27:34 crc kubenswrapper[4753]: I1005 21:27:34.492370 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 21:27:34 crc kubenswrapper[4753]: I1005 21:27:34.492445 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" containerID="cri-o://814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" gracePeriod=600 Oct 05 21:27:34 crc kubenswrapper[4753]: E1005 21:27:34.629933 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:27:35 
crc kubenswrapper[4753]: I1005 21:27:35.432693 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" exitCode=0 Oct 05 21:27:35 crc kubenswrapper[4753]: I1005 21:27:35.432750 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691"} Oct 05 21:27:35 crc kubenswrapper[4753]: I1005 21:27:35.433175 4753 scope.go:117] "RemoveContainer" containerID="cf5b3dbe67258318b8270a000eb7f71dacdc8fd82ceca6b74fc564bc02d85d55" Oct 05 21:27:35 crc kubenswrapper[4753]: I1005 21:27:35.433749 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:27:35 crc kubenswrapper[4753]: E1005 21:27:35.434041 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:27:46 crc kubenswrapper[4753]: I1005 21:27:46.853392 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:27:46 crc kubenswrapper[4753]: E1005 21:27:46.854390 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:27:51 crc kubenswrapper[4753]: I1005 21:27:51.491633 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-568f5b5b96-6t6qd_75308018-c2d6-42ef-9776-f3b861ec86ed/barbican-api/0.log" Oct 05 21:27:51 crc kubenswrapper[4753]: I1005 21:27:51.593257 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-568f5b5b96-6t6qd_75308018-c2d6-42ef-9776-f3b861ec86ed/barbican-api-log/0.log" Oct 05 21:27:51 crc kubenswrapper[4753]: I1005 21:27:51.807638 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-fc7ffbf98-q7225_1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c/barbican-keystone-listener-log/0.log" Oct 05 21:27:51 crc kubenswrapper[4753]: I1005 21:27:51.845099 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-fc7ffbf98-q7225_1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c/barbican-keystone-listener/0.log" Oct 05 21:27:52 crc kubenswrapper[4753]: I1005 21:27:52.005227 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-c895b6649-pwwkn_4f08b26d-6499-4aac-93c6-7d07e4e98d47/barbican-worker/0.log" Oct 05 21:27:52 crc kubenswrapper[4753]: I1005 21:27:52.106775 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-c895b6649-pwwkn_4f08b26d-6499-4aac-93c6-7d07e4e98d47/barbican-worker-log/0.log" Oct 05 21:27:52 crc kubenswrapper[4753]: I1005 21:27:52.366342 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv_2e0a083e-4f35-4cbf-89af-348a03a81159/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:27:52 crc kubenswrapper[4753]: I1005 21:27:52.547456 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_68a4d310-a272-4033-af72-dfc6e8c239f6/ceilometer-central-agent/0.log" Oct 05 21:27:52 crc kubenswrapper[4753]: I1005 21:27:52.662339 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_68a4d310-a272-4033-af72-dfc6e8c239f6/ceilometer-notification-agent/0.log" Oct 05 21:27:52 crc kubenswrapper[4753]: I1005 21:27:52.694363 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_68a4d310-a272-4033-af72-dfc6e8c239f6/proxy-httpd/0.log" Oct 05 21:27:52 crc kubenswrapper[4753]: I1005 21:27:52.770233 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_68a4d310-a272-4033-af72-dfc6e8c239f6/sg-core/0.log" Oct 05 21:27:52 crc kubenswrapper[4753]: I1005 21:27:52.912244 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj_a3425a91-733d-43c0-b7af-42914da99374/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:27:53 crc kubenswrapper[4753]: I1005 21:27:53.200872 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42_73e27d2b-d430-4df5-9380-e3b3f6a75420/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:27:53 crc kubenswrapper[4753]: I1005 21:27:53.276944 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ba2c30bc-68b4-4803-852e-b12fe770196d/cinder-api/0.log" Oct 05 21:27:53 crc kubenswrapper[4753]: I1005 21:27:53.394984 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ba2c30bc-68b4-4803-852e-b12fe770196d/cinder-api-log/0.log" Oct 05 21:27:53 crc kubenswrapper[4753]: I1005 21:27:53.613016 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_adbbbc89-97ba-492f-a842-c9bf33a69480/cinder-backup/0.log" Oct 05 21:27:53 crc kubenswrapper[4753]: I1005 21:27:53.660512 
4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_adbbbc89-97ba-492f-a842-c9bf33a69480/probe/0.log" Oct 05 21:27:53 crc kubenswrapper[4753]: I1005 21:27:53.898830 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_56bdb919-1995-4b2a-855b-4d7ece37ce4c/probe/0.log" Oct 05 21:27:53 crc kubenswrapper[4753]: I1005 21:27:53.944887 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_56bdb919-1995-4b2a-855b-4d7ece37ce4c/cinder-scheduler/0.log" Oct 05 21:27:54 crc kubenswrapper[4753]: I1005 21:27:54.150479 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_45a5357e-d55a-4532-aaff-fe090b71fc60/cinder-volume/0.log" Oct 05 21:27:54 crc kubenswrapper[4753]: I1005 21:27:54.237171 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_45a5357e-d55a-4532-aaff-fe090b71fc60/probe/0.log" Oct 05 21:27:54 crc kubenswrapper[4753]: I1005 21:27:54.361599 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn_a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:27:54 crc kubenswrapper[4753]: I1005 21:27:54.796600 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj_f21504e5-2012-4b4a-a3fc-16e6dc364373/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:27:55 crc kubenswrapper[4753]: I1005 21:27:55.062176 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67948f47bf-jnd5v_16d16fc5-ebf6-49b5-a837-7b19a005ee21/init/0.log" Oct 05 21:27:55 crc kubenswrapper[4753]: I1005 21:27:55.203698 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-67948f47bf-jnd5v_16d16fc5-ebf6-49b5-a837-7b19a005ee21/init/0.log" Oct 05 21:27:55 crc kubenswrapper[4753]: I1005 21:27:55.246080 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e4e4554e-b923-40f1-ac86-abc4cb871d21/glance-httpd/0.log" Oct 05 21:27:55 crc kubenswrapper[4753]: I1005 21:27:55.371855 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67948f47bf-jnd5v_16d16fc5-ebf6-49b5-a837-7b19a005ee21/dnsmasq-dns/0.log" Oct 05 21:27:55 crc kubenswrapper[4753]: I1005 21:27:55.485758 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e4e4554e-b923-40f1-ac86-abc4cb871d21/glance-log/0.log" Oct 05 21:27:55 crc kubenswrapper[4753]: I1005 21:27:55.511512 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4606d5be-d97d-4c1b-95df-1aad021ced17/glance-httpd/0.log" Oct 05 21:27:55 crc kubenswrapper[4753]: I1005 21:27:55.564245 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4606d5be-d97d-4c1b-95df-1aad021ced17/glance-log/0.log" Oct 05 21:27:55 crc kubenswrapper[4753]: I1005 21:27:55.890510 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-745b9fcf5d-xkxjq_e1309d62-7702-49bc-892f-705d8ac9fff3/horizon/0.log" Oct 05 21:27:55 crc kubenswrapper[4753]: I1005 21:27:55.892419 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-745b9fcf5d-xkxjq_e1309d62-7702-49bc-892f-705d8ac9fff3/horizon-log/0.log" Oct 05 21:27:55 crc kubenswrapper[4753]: I1005 21:27:55.981178 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd_13f3a6ea-8a17-4bf8-a252-f53e5856466a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:27:56 crc kubenswrapper[4753]: I1005 
21:27:56.114933 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4c5cn_d5a16b03-a799-4548-8a7f-bf73d3f4a52a/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:27:56 crc kubenswrapper[4753]: I1005 21:27:56.388310 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-85d68b7848-bh92h_bdbb6c59-3c98-4b88-a1aa-7304476a522a/keystone-api/0.log" Oct 05 21:27:56 crc kubenswrapper[4753]: I1005 21:27:56.850272 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29328301-rlwxs_383607d3-fca4-477a-a189-c6aab8192496/keystone-cron/0.log" Oct 05 21:27:56 crc kubenswrapper[4753]: I1005 21:27:56.902720 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5dd5b3b0-432b-4040-8544-d68497fca1de/kube-state-metrics/0.log" Oct 05 21:27:57 crc kubenswrapper[4753]: I1005 21:27:57.161172 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-nrg62_9f393cda-bc70-44d4-a534-a72b71dcf0b7/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:27:57 crc kubenswrapper[4753]: I1005 21:27:57.224381 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_d1597495-6f4a-4887-bacc-8082ad9784d4/manila-api/0.log" Oct 05 21:27:57 crc kubenswrapper[4753]: I1005 21:27:57.321370 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_d1597495-6f4a-4887-bacc-8082ad9784d4/manila-api-log/0.log" Oct 05 21:27:57 crc kubenswrapper[4753]: I1005 21:27:57.465578 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a/manila-scheduler/0.log" Oct 05 21:27:57 crc kubenswrapper[4753]: I1005 21:27:57.473121 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-scheduler-0_7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a/probe/0.log" Oct 05 21:27:57 crc kubenswrapper[4753]: I1005 21:27:57.638547 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_d8dd124d-011e-41dd-813b-b16ad8039461/manila-share/0.log" Oct 05 21:27:57 crc kubenswrapper[4753]: I1005 21:27:57.710878 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_d8dd124d-011e-41dd-813b-b16ad8039461/probe/0.log" Oct 05 21:27:58 crc kubenswrapper[4753]: I1005 21:27:58.137022 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-79cfb6d465-74j5v_e26c6617-558f-445a-be5b-02578e006437/neutron-api/0.log" Oct 05 21:27:58 crc kubenswrapper[4753]: I1005 21:27:58.151833 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-79cfb6d465-74j5v_e26c6617-558f-445a-be5b-02578e006437/neutron-httpd/0.log" Oct 05 21:27:58 crc kubenswrapper[4753]: I1005 21:27:58.402630 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q_e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:27:59 crc kubenswrapper[4753]: I1005 21:27:59.247920 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_fd2971d2-d61f-4268-9366-6e11ae7f71bc/nova-api-log/0.log" Oct 05 21:27:59 crc kubenswrapper[4753]: I1005 21:27:59.270069 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_d8ddb5b3-36ec-421d-a5d0-f465f7cf0316/nova-cell0-conductor-conductor/0.log" Oct 05 21:27:59 crc kubenswrapper[4753]: I1005 21:27:59.330203 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_fd2971d2-d61f-4268-9366-6e11ae7f71bc/nova-api-api/0.log" Oct 05 21:27:59 crc kubenswrapper[4753]: I1005 21:27:59.602524 4753 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_ec681529-93c2-4792-8e1e-ccbc696ed9ee/nova-cell1-conductor-conductor/0.log" Oct 05 21:28:00 crc kubenswrapper[4753]: I1005 21:28:00.014875 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_be546d4c-4192-4338-aaf3-2849807daf9d/nova-cell1-novncproxy-novncproxy/0.log" Oct 05 21:28:00 crc kubenswrapper[4753]: I1005 21:28:00.054397 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk_db911cf0-3e57-45a3-a1ce-06f5260745b4/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:28:00 crc kubenswrapper[4753]: I1005 21:28:00.451199 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3649c567-6d73-4afe-a1aa-d6621a5cc89f/nova-metadata-log/0.log" Oct 05 21:28:00 crc kubenswrapper[4753]: I1005 21:28:00.847555 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_346a135f-f1af-4968-9c9f-4540f2a71161/nova-scheduler-scheduler/0.log" Oct 05 21:28:00 crc kubenswrapper[4753]: I1005 21:28:00.851823 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:28:00 crc kubenswrapper[4753]: E1005 21:28:00.852123 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:28:01 crc kubenswrapper[4753]: I1005 21:28:01.221594 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_c20016b6-f321-4cf2-b09a-d35b96c85805/mysql-bootstrap/0.log" Oct 05 21:28:01 crc kubenswrapper[4753]: I1005 21:28:01.265113 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c20016b6-f321-4cf2-b09a-d35b96c85805/mysql-bootstrap/0.log" Oct 05 21:28:01 crc kubenswrapper[4753]: I1005 21:28:01.515238 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c20016b6-f321-4cf2-b09a-d35b96c85805/galera/0.log" Oct 05 21:28:01 crc kubenswrapper[4753]: I1005 21:28:01.762897 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_81bd134f-0bbd-4cda-b29c-d4d514d4dbe7/mysql-bootstrap/0.log" Oct 05 21:28:01 crc kubenswrapper[4753]: I1005 21:28:01.989704 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3649c567-6d73-4afe-a1aa-d6621a5cc89f/nova-metadata-metadata/0.log" Oct 05 21:28:02 crc kubenswrapper[4753]: I1005 21:28:02.129272 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_81bd134f-0bbd-4cda-b29c-d4d514d4dbe7/galera/0.log" Oct 05 21:28:02 crc kubenswrapper[4753]: I1005 21:28:02.168550 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_81bd134f-0bbd-4cda-b29c-d4d514d4dbe7/mysql-bootstrap/0.log" Oct 05 21:28:02 crc kubenswrapper[4753]: I1005 21:28:02.361078 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ae2325dc-1d53-4605-84d9-c5a341d6c311/openstackclient/0.log" Oct 05 21:28:02 crc kubenswrapper[4753]: I1005 21:28:02.599516 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-7zxq7_61f845cb-9404-421b-b20f-9dee4edd00f8/ovn-controller/0.log" Oct 05 21:28:02 crc kubenswrapper[4753]: I1005 21:28:02.870395 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-5xx4p_2ba751b9-51bf-4c3d-8a7a-35af6ebe354f/openstack-network-exporter/0.log" Oct 05 21:28:02 crc kubenswrapper[4753]: I1005 21:28:02.996404 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8krg4_61cd2b9f-f08f-4b47-be04-1d9246a5cbdb/ovsdb-server-init/0.log" Oct 05 21:28:03 crc kubenswrapper[4753]: I1005 21:28:03.429535 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8krg4_61cd2b9f-f08f-4b47-be04-1d9246a5cbdb/ovsdb-server-init/0.log" Oct 05 21:28:03 crc kubenswrapper[4753]: I1005 21:28:03.481809 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8krg4_61cd2b9f-f08f-4b47-be04-1d9246a5cbdb/ovsdb-server/0.log" Oct 05 21:28:03 crc kubenswrapper[4753]: I1005 21:28:03.549815 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8krg4_61cd2b9f-f08f-4b47-be04-1d9246a5cbdb/ovs-vswitchd/0.log" Oct 05 21:28:03 crc kubenswrapper[4753]: I1005 21:28:03.754424 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-btv49_103811f6-8ae0-475f-878b-0c5c615265ee/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:28:03 crc kubenswrapper[4753]: I1005 21:28:03.955926 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0410bd72-9899-4174-9258-4efbdc6cd7c8/openstack-network-exporter/0.log" Oct 05 21:28:03 crc kubenswrapper[4753]: I1005 21:28:03.956466 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0410bd72-9899-4174-9258-4efbdc6cd7c8/ovn-northd/0.log" Oct 05 21:28:04 crc kubenswrapper[4753]: I1005 21:28:04.292169 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_19914a64-715e-4a20-82fc-f4e86b8e9e21/ovsdbserver-nb/0.log" Oct 05 21:28:04 crc kubenswrapper[4753]: I1005 21:28:04.327210 4753 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_19914a64-715e-4a20-82fc-f4e86b8e9e21/openstack-network-exporter/0.log" Oct 05 21:28:05 crc kubenswrapper[4753]: I1005 21:28:05.103275 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_396e74f9-75f0-4643-a011-da8c56174984/ovsdbserver-sb/0.log" Oct 05 21:28:05 crc kubenswrapper[4753]: I1005 21:28:05.169484 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_396e74f9-75f0-4643-a011-da8c56174984/openstack-network-exporter/0.log" Oct 05 21:28:05 crc kubenswrapper[4753]: I1005 21:28:05.462406 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-78454fb4-ktvqp_131ce515-ac42-4446-b075-5e50254e6686/placement-api/0.log" Oct 05 21:28:05 crc kubenswrapper[4753]: I1005 21:28:05.498604 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-78454fb4-ktvqp_131ce515-ac42-4446-b075-5e50254e6686/placement-log/0.log" Oct 05 21:28:05 crc kubenswrapper[4753]: I1005 21:28:05.669378 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0022b5ba-c84b-4ee1-84a5-8e04d7c4d330/setup-container/0.log" Oct 05 21:28:05 crc kubenswrapper[4753]: I1005 21:28:05.972220 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0022b5ba-c84b-4ee1-84a5-8e04d7c4d330/setup-container/0.log" Oct 05 21:28:05 crc kubenswrapper[4753]: I1005 21:28:05.994792 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0022b5ba-c84b-4ee1-84a5-8e04d7c4d330/rabbitmq/0.log" Oct 05 21:28:06 crc kubenswrapper[4753]: I1005 21:28:06.113257 4753 scope.go:117] "RemoveContainer" containerID="421a75d7b8d2956abee7e087a3478e0c2db295afbc1fe1d5e8c17095478fec4f" Oct 05 21:28:06 crc kubenswrapper[4753]: I1005 21:28:06.159773 4753 scope.go:117] "RemoveContainer" 
containerID="2ee65d62eaaf9b52d2c8146fe83824b7c08cc2726d4bb57c9532788135dcd62a" Oct 05 21:28:06 crc kubenswrapper[4753]: I1005 21:28:06.203592 4753 scope.go:117] "RemoveContainer" containerID="3850a0a21da018d017682f11a467bbdac2ae6ccdd20cb80563a7dfa0a63ad791" Oct 05 21:28:06 crc kubenswrapper[4753]: I1005 21:28:06.269639 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_468c6dc5-e196-4084-9211-d2b06253832d/setup-container/0.log" Oct 05 21:28:06 crc kubenswrapper[4753]: I1005 21:28:06.398558 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_468c6dc5-e196-4084-9211-d2b06253832d/setup-container/0.log" Oct 05 21:28:06 crc kubenswrapper[4753]: I1005 21:28:06.564153 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_468c6dc5-e196-4084-9211-d2b06253832d/rabbitmq/0.log" Oct 05 21:28:06 crc kubenswrapper[4753]: I1005 21:28:06.648430 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s_dd50c8ec-c247-4691-9c6d-6c72c1e89227/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:28:06 crc kubenswrapper[4753]: I1005 21:28:06.902240 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk_91285735-785c-4889-9913-bb3e58ffed5f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:28:07 crc kubenswrapper[4753]: I1005 21:28:07.051005 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-vqmtg_1073a302-b108-4caa-aa77-78d64fd8f169/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:28:07 crc kubenswrapper[4753]: I1005 21:28:07.260928 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-n4pww_5d24d938-36bb-4d7b-94e6-f0332f50a71a/ssh-known-hosts-edpm-deployment/0.log" Oct 05 21:28:07 crc 
kubenswrapper[4753]: I1005 21:28:07.523177 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_989178f4-ef23-49c1-88f8-10babb448a68/tempest-tests-tempest-tests-runner/0.log" Oct 05 21:28:07 crc kubenswrapper[4753]: I1005 21:28:07.722251 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_cf30bb5c-e675-411f-b50d-77dff81c83af/test-operator-logs-container/0.log" Oct 05 21:28:07 crc kubenswrapper[4753]: I1005 21:28:07.856037 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-xs45v_e9938f80-4e3c-476e-bd1d-11e1646d9176/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:28:13 crc kubenswrapper[4753]: I1005 21:28:13.776127 4753 generic.go:334] "Generic (PLEG): container finished" podID="18f82263-d631-4623-a434-b940a6cb65c0" containerID="cce6d019711360b0e1303c6875bbb315880b2d1b7fb0b6d6fcf150422bb10631" exitCode=0 Oct 05 21:28:13 crc kubenswrapper[4753]: I1005 21:28:13.776216 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-txk7h/crc-debug-4dsc6" event={"ID":"18f82263-d631-4623-a434-b940a6cb65c0","Type":"ContainerDied","Data":"cce6d019711360b0e1303c6875bbb315880b2d1b7fb0b6d6fcf150422bb10631"} Oct 05 21:28:13 crc kubenswrapper[4753]: I1005 21:28:13.852170 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:28:13 crc kubenswrapper[4753]: E1005 21:28:13.852470 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" 
podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:28:14 crc kubenswrapper[4753]: I1005 21:28:14.889795 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-txk7h/crc-debug-4dsc6" Oct 05 21:28:14 crc kubenswrapper[4753]: I1005 21:28:14.933764 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-txk7h/crc-debug-4dsc6"] Oct 05 21:28:14 crc kubenswrapper[4753]: I1005 21:28:14.946116 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-txk7h/crc-debug-4dsc6"] Oct 05 21:28:14 crc kubenswrapper[4753]: I1005 21:28:14.982835 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7grsf\" (UniqueName: \"kubernetes.io/projected/18f82263-d631-4623-a434-b940a6cb65c0-kube-api-access-7grsf\") pod \"18f82263-d631-4623-a434-b940a6cb65c0\" (UID: \"18f82263-d631-4623-a434-b940a6cb65c0\") " Oct 05 21:28:14 crc kubenswrapper[4753]: I1005 21:28:14.982914 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18f82263-d631-4623-a434-b940a6cb65c0-host\") pod \"18f82263-d631-4623-a434-b940a6cb65c0\" (UID: \"18f82263-d631-4623-a434-b940a6cb65c0\") " Oct 05 21:28:14 crc kubenswrapper[4753]: I1005 21:28:14.983236 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18f82263-d631-4623-a434-b940a6cb65c0-host" (OuterVolumeSpecName: "host") pod "18f82263-d631-4623-a434-b940a6cb65c0" (UID: "18f82263-d631-4623-a434-b940a6cb65c0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 21:28:14 crc kubenswrapper[4753]: I1005 21:28:14.983626 4753 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18f82263-d631-4623-a434-b940a6cb65c0-host\") on node \"crc\" DevicePath \"\"" Oct 05 21:28:15 crc kubenswrapper[4753]: I1005 21:28:15.014320 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18f82263-d631-4623-a434-b940a6cb65c0-kube-api-access-7grsf" (OuterVolumeSpecName: "kube-api-access-7grsf") pod "18f82263-d631-4623-a434-b940a6cb65c0" (UID: "18f82263-d631-4623-a434-b940a6cb65c0"). InnerVolumeSpecName "kube-api-access-7grsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:28:15 crc kubenswrapper[4753]: I1005 21:28:15.085614 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7grsf\" (UniqueName: \"kubernetes.io/projected/18f82263-d631-4623-a434-b940a6cb65c0-kube-api-access-7grsf\") on node \"crc\" DevicePath \"\"" Oct 05 21:28:15 crc kubenswrapper[4753]: I1005 21:28:15.794980 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3093a0bad3a92a07d132516f04a8e8c64d313b0b86840ade7878797ed87179c" Oct 05 21:28:15 crc kubenswrapper[4753]: I1005 21:28:15.795262 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-txk7h/crc-debug-4dsc6" Oct 05 21:28:15 crc kubenswrapper[4753]: I1005 21:28:15.863925 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18f82263-d631-4623-a434-b940a6cb65c0" path="/var/lib/kubelet/pods/18f82263-d631-4623-a434-b940a6cb65c0/volumes" Oct 05 21:28:16 crc kubenswrapper[4753]: I1005 21:28:16.155560 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-txk7h/crc-debug-b4nn8"] Oct 05 21:28:16 crc kubenswrapper[4753]: E1005 21:28:16.156023 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f82263-d631-4623-a434-b940a6cb65c0" containerName="container-00" Oct 05 21:28:16 crc kubenswrapper[4753]: I1005 21:28:16.156035 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f82263-d631-4623-a434-b940a6cb65c0" containerName="container-00" Oct 05 21:28:16 crc kubenswrapper[4753]: I1005 21:28:16.156243 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f82263-d631-4623-a434-b940a6cb65c0" containerName="container-00" Oct 05 21:28:16 crc kubenswrapper[4753]: I1005 21:28:16.156873 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-txk7h/crc-debug-b4nn8" Oct 05 21:28:16 crc kubenswrapper[4753]: I1005 21:28:16.331306 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfe0bdb4-1a8a-4d4e-9681-6e440060685d-host\") pod \"crc-debug-b4nn8\" (UID: \"bfe0bdb4-1a8a-4d4e-9681-6e440060685d\") " pod="openshift-must-gather-txk7h/crc-debug-b4nn8" Oct 05 21:28:16 crc kubenswrapper[4753]: I1005 21:28:16.331454 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7l4x\" (UniqueName: \"kubernetes.io/projected/bfe0bdb4-1a8a-4d4e-9681-6e440060685d-kube-api-access-j7l4x\") pod \"crc-debug-b4nn8\" (UID: \"bfe0bdb4-1a8a-4d4e-9681-6e440060685d\") " pod="openshift-must-gather-txk7h/crc-debug-b4nn8" Oct 05 21:28:16 crc kubenswrapper[4753]: I1005 21:28:16.432755 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7l4x\" (UniqueName: \"kubernetes.io/projected/bfe0bdb4-1a8a-4d4e-9681-6e440060685d-kube-api-access-j7l4x\") pod \"crc-debug-b4nn8\" (UID: \"bfe0bdb4-1a8a-4d4e-9681-6e440060685d\") " pod="openshift-must-gather-txk7h/crc-debug-b4nn8" Oct 05 21:28:16 crc kubenswrapper[4753]: I1005 21:28:16.432837 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfe0bdb4-1a8a-4d4e-9681-6e440060685d-host\") pod \"crc-debug-b4nn8\" (UID: \"bfe0bdb4-1a8a-4d4e-9681-6e440060685d\") " pod="openshift-must-gather-txk7h/crc-debug-b4nn8" Oct 05 21:28:16 crc kubenswrapper[4753]: I1005 21:28:16.432936 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfe0bdb4-1a8a-4d4e-9681-6e440060685d-host\") pod \"crc-debug-b4nn8\" (UID: \"bfe0bdb4-1a8a-4d4e-9681-6e440060685d\") " pod="openshift-must-gather-txk7h/crc-debug-b4nn8" Oct 05 21:28:16 crc 
kubenswrapper[4753]: I1005 21:28:16.450297 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7l4x\" (UniqueName: \"kubernetes.io/projected/bfe0bdb4-1a8a-4d4e-9681-6e440060685d-kube-api-access-j7l4x\") pod \"crc-debug-b4nn8\" (UID: \"bfe0bdb4-1a8a-4d4e-9681-6e440060685d\") " pod="openshift-must-gather-txk7h/crc-debug-b4nn8" Oct 05 21:28:16 crc kubenswrapper[4753]: I1005 21:28:16.474621 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-txk7h/crc-debug-b4nn8" Oct 05 21:28:16 crc kubenswrapper[4753]: I1005 21:28:16.838289 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-txk7h/crc-debug-b4nn8" event={"ID":"bfe0bdb4-1a8a-4d4e-9681-6e440060685d","Type":"ContainerStarted","Data":"afcf5e39462ad9e9dd9fc975a232464f3125f5a134ea7a8f4160135f542c4f53"} Oct 05 21:28:17 crc kubenswrapper[4753]: I1005 21:28:17.224771 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_5091b95a-0011-45bb-b4b8-be273f03f7b4/memcached/0.log" Oct 05 21:28:17 crc kubenswrapper[4753]: I1005 21:28:17.846573 4753 generic.go:334] "Generic (PLEG): container finished" podID="bfe0bdb4-1a8a-4d4e-9681-6e440060685d" containerID="c3123e275c3787d7febe754fc369d3797c33babaa22b774cfe4ca3de5bf8c8eb" exitCode=0 Oct 05 21:28:17 crc kubenswrapper[4753]: I1005 21:28:17.846863 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-txk7h/crc-debug-b4nn8" event={"ID":"bfe0bdb4-1a8a-4d4e-9681-6e440060685d","Type":"ContainerDied","Data":"c3123e275c3787d7febe754fc369d3797c33babaa22b774cfe4ca3de5bf8c8eb"} Oct 05 21:28:18 crc kubenswrapper[4753]: I1005 21:28:18.957490 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-txk7h/crc-debug-b4nn8" Oct 05 21:28:19 crc kubenswrapper[4753]: I1005 21:28:19.096093 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7l4x\" (UniqueName: \"kubernetes.io/projected/bfe0bdb4-1a8a-4d4e-9681-6e440060685d-kube-api-access-j7l4x\") pod \"bfe0bdb4-1a8a-4d4e-9681-6e440060685d\" (UID: \"bfe0bdb4-1a8a-4d4e-9681-6e440060685d\") " Oct 05 21:28:19 crc kubenswrapper[4753]: I1005 21:28:19.096240 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfe0bdb4-1a8a-4d4e-9681-6e440060685d-host\") pod \"bfe0bdb4-1a8a-4d4e-9681-6e440060685d\" (UID: \"bfe0bdb4-1a8a-4d4e-9681-6e440060685d\") " Oct 05 21:28:19 crc kubenswrapper[4753]: I1005 21:28:19.096742 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfe0bdb4-1a8a-4d4e-9681-6e440060685d-host" (OuterVolumeSpecName: "host") pod "bfe0bdb4-1a8a-4d4e-9681-6e440060685d" (UID: "bfe0bdb4-1a8a-4d4e-9681-6e440060685d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 21:28:19 crc kubenswrapper[4753]: I1005 21:28:19.104203 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe0bdb4-1a8a-4d4e-9681-6e440060685d-kube-api-access-j7l4x" (OuterVolumeSpecName: "kube-api-access-j7l4x") pod "bfe0bdb4-1a8a-4d4e-9681-6e440060685d" (UID: "bfe0bdb4-1a8a-4d4e-9681-6e440060685d"). InnerVolumeSpecName "kube-api-access-j7l4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:28:19 crc kubenswrapper[4753]: I1005 21:28:19.198179 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7l4x\" (UniqueName: \"kubernetes.io/projected/bfe0bdb4-1a8a-4d4e-9681-6e440060685d-kube-api-access-j7l4x\") on node \"crc\" DevicePath \"\"" Oct 05 21:28:19 crc kubenswrapper[4753]: I1005 21:28:19.198393 4753 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfe0bdb4-1a8a-4d4e-9681-6e440060685d-host\") on node \"crc\" DevicePath \"\"" Oct 05 21:28:19 crc kubenswrapper[4753]: I1005 21:28:19.865326 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-txk7h/crc-debug-b4nn8" event={"ID":"bfe0bdb4-1a8a-4d4e-9681-6e440060685d","Type":"ContainerDied","Data":"afcf5e39462ad9e9dd9fc975a232464f3125f5a134ea7a8f4160135f542c4f53"} Oct 05 21:28:19 crc kubenswrapper[4753]: I1005 21:28:19.865569 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afcf5e39462ad9e9dd9fc975a232464f3125f5a134ea7a8f4160135f542c4f53" Oct 05 21:28:19 crc kubenswrapper[4753]: I1005 21:28:19.865609 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-txk7h/crc-debug-b4nn8" Oct 05 21:28:23 crc kubenswrapper[4753]: I1005 21:28:23.276325 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-txk7h/crc-debug-b4nn8"] Oct 05 21:28:23 crc kubenswrapper[4753]: I1005 21:28:23.341645 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-txk7h/crc-debug-b4nn8"] Oct 05 21:28:23 crc kubenswrapper[4753]: I1005 21:28:23.862544 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfe0bdb4-1a8a-4d4e-9681-6e440060685d" path="/var/lib/kubelet/pods/bfe0bdb4-1a8a-4d4e-9681-6e440060685d/volumes" Oct 05 21:28:24 crc kubenswrapper[4753]: I1005 21:28:24.571425 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-txk7h/crc-debug-8kkzs"] Oct 05 21:28:24 crc kubenswrapper[4753]: E1005 21:28:24.572014 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe0bdb4-1a8a-4d4e-9681-6e440060685d" containerName="container-00" Oct 05 21:28:24 crc kubenswrapper[4753]: I1005 21:28:24.572026 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe0bdb4-1a8a-4d4e-9681-6e440060685d" containerName="container-00" Oct 05 21:28:24 crc kubenswrapper[4753]: I1005 21:28:24.572214 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe0bdb4-1a8a-4d4e-9681-6e440060685d" containerName="container-00" Oct 05 21:28:24 crc kubenswrapper[4753]: I1005 21:28:24.572773 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-txk7h/crc-debug-8kkzs" Oct 05 21:28:24 crc kubenswrapper[4753]: I1005 21:28:24.700565 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8eb2fb36-2d32-4134-b83f-aa85b45d0fb4-host\") pod \"crc-debug-8kkzs\" (UID: \"8eb2fb36-2d32-4134-b83f-aa85b45d0fb4\") " pod="openshift-must-gather-txk7h/crc-debug-8kkzs" Oct 05 21:28:24 crc kubenswrapper[4753]: I1005 21:28:24.700651 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlqv2\" (UniqueName: \"kubernetes.io/projected/8eb2fb36-2d32-4134-b83f-aa85b45d0fb4-kube-api-access-qlqv2\") pod \"crc-debug-8kkzs\" (UID: \"8eb2fb36-2d32-4134-b83f-aa85b45d0fb4\") " pod="openshift-must-gather-txk7h/crc-debug-8kkzs" Oct 05 21:28:24 crc kubenswrapper[4753]: I1005 21:28:24.801957 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8eb2fb36-2d32-4134-b83f-aa85b45d0fb4-host\") pod \"crc-debug-8kkzs\" (UID: \"8eb2fb36-2d32-4134-b83f-aa85b45d0fb4\") " pod="openshift-must-gather-txk7h/crc-debug-8kkzs" Oct 05 21:28:24 crc kubenswrapper[4753]: I1005 21:28:24.802023 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlqv2\" (UniqueName: \"kubernetes.io/projected/8eb2fb36-2d32-4134-b83f-aa85b45d0fb4-kube-api-access-qlqv2\") pod \"crc-debug-8kkzs\" (UID: \"8eb2fb36-2d32-4134-b83f-aa85b45d0fb4\") " pod="openshift-must-gather-txk7h/crc-debug-8kkzs" Oct 05 21:28:24 crc kubenswrapper[4753]: I1005 21:28:24.802122 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8eb2fb36-2d32-4134-b83f-aa85b45d0fb4-host\") pod \"crc-debug-8kkzs\" (UID: \"8eb2fb36-2d32-4134-b83f-aa85b45d0fb4\") " pod="openshift-must-gather-txk7h/crc-debug-8kkzs" Oct 05 21:28:24 crc 
kubenswrapper[4753]: I1005 21:28:24.853355 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:28:24 crc kubenswrapper[4753]: E1005 21:28:24.853590 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:28:24 crc kubenswrapper[4753]: I1005 21:28:24.853892 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlqv2\" (UniqueName: \"kubernetes.io/projected/8eb2fb36-2d32-4134-b83f-aa85b45d0fb4-kube-api-access-qlqv2\") pod \"crc-debug-8kkzs\" (UID: \"8eb2fb36-2d32-4134-b83f-aa85b45d0fb4\") " pod="openshift-must-gather-txk7h/crc-debug-8kkzs" Oct 05 21:28:24 crc kubenswrapper[4753]: I1005 21:28:24.889254 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-txk7h/crc-debug-8kkzs" Oct 05 21:28:25 crc kubenswrapper[4753]: I1005 21:28:25.926671 4753 generic.go:334] "Generic (PLEG): container finished" podID="8eb2fb36-2d32-4134-b83f-aa85b45d0fb4" containerID="573ae85e6480bc4d95aeabf2e487ee1f52900f8664ff934105702077f97241ed" exitCode=0 Oct 05 21:28:25 crc kubenswrapper[4753]: I1005 21:28:25.926763 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-txk7h/crc-debug-8kkzs" event={"ID":"8eb2fb36-2d32-4134-b83f-aa85b45d0fb4","Type":"ContainerDied","Data":"573ae85e6480bc4d95aeabf2e487ee1f52900f8664ff934105702077f97241ed"} Oct 05 21:28:25 crc kubenswrapper[4753]: I1005 21:28:25.926959 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-txk7h/crc-debug-8kkzs" event={"ID":"8eb2fb36-2d32-4134-b83f-aa85b45d0fb4","Type":"ContainerStarted","Data":"bd54bedf686ae41d56d78265de2491efbd14f951f5ddcffed160d7fd1ceebab6"} Oct 05 21:28:25 crc kubenswrapper[4753]: I1005 21:28:25.967871 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-txk7h/crc-debug-8kkzs"] Oct 05 21:28:25 crc kubenswrapper[4753]: I1005 21:28:25.976890 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-txk7h/crc-debug-8kkzs"] Oct 05 21:28:27 crc kubenswrapper[4753]: I1005 21:28:27.029834 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-txk7h/crc-debug-8kkzs" Oct 05 21:28:27 crc kubenswrapper[4753]: I1005 21:28:27.152398 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlqv2\" (UniqueName: \"kubernetes.io/projected/8eb2fb36-2d32-4134-b83f-aa85b45d0fb4-kube-api-access-qlqv2\") pod \"8eb2fb36-2d32-4134-b83f-aa85b45d0fb4\" (UID: \"8eb2fb36-2d32-4134-b83f-aa85b45d0fb4\") " Oct 05 21:28:27 crc kubenswrapper[4753]: I1005 21:28:27.152549 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8eb2fb36-2d32-4134-b83f-aa85b45d0fb4-host\") pod \"8eb2fb36-2d32-4134-b83f-aa85b45d0fb4\" (UID: \"8eb2fb36-2d32-4134-b83f-aa85b45d0fb4\") " Oct 05 21:28:27 crc kubenswrapper[4753]: I1005 21:28:27.152685 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8eb2fb36-2d32-4134-b83f-aa85b45d0fb4-host" (OuterVolumeSpecName: "host") pod "8eb2fb36-2d32-4134-b83f-aa85b45d0fb4" (UID: "8eb2fb36-2d32-4134-b83f-aa85b45d0fb4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 21:28:27 crc kubenswrapper[4753]: I1005 21:28:27.153100 4753 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8eb2fb36-2d32-4134-b83f-aa85b45d0fb4-host\") on node \"crc\" DevicePath \"\"" Oct 05 21:28:27 crc kubenswrapper[4753]: I1005 21:28:27.158305 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eb2fb36-2d32-4134-b83f-aa85b45d0fb4-kube-api-access-qlqv2" (OuterVolumeSpecName: "kube-api-access-qlqv2") pod "8eb2fb36-2d32-4134-b83f-aa85b45d0fb4" (UID: "8eb2fb36-2d32-4134-b83f-aa85b45d0fb4"). InnerVolumeSpecName "kube-api-access-qlqv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:28:27 crc kubenswrapper[4753]: I1005 21:28:27.254620 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlqv2\" (UniqueName: \"kubernetes.io/projected/8eb2fb36-2d32-4134-b83f-aa85b45d0fb4-kube-api-access-qlqv2\") on node \"crc\" DevicePath \"\"" Oct 05 21:28:27 crc kubenswrapper[4753]: I1005 21:28:27.881922 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eb2fb36-2d32-4134-b83f-aa85b45d0fb4" path="/var/lib/kubelet/pods/8eb2fb36-2d32-4134-b83f-aa85b45d0fb4/volumes" Oct 05 21:28:27 crc kubenswrapper[4753]: I1005 21:28:27.944175 4753 scope.go:117] "RemoveContainer" containerID="573ae85e6480bc4d95aeabf2e487ee1f52900f8664ff934105702077f97241ed" Oct 05 21:28:27 crc kubenswrapper[4753]: I1005 21:28:27.944756 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-txk7h/crc-debug-8kkzs" Oct 05 21:28:36 crc kubenswrapper[4753]: I1005 21:28:36.852491 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:28:36 crc kubenswrapper[4753]: E1005 21:28:36.853104 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:28:40 crc kubenswrapper[4753]: I1005 21:28:40.863672 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574_95cb6a2f-3fb4-43d6-b342-c17059bf0e73/util/0.log" Oct 05 21:28:41 crc kubenswrapper[4753]: I1005 21:28:41.073789 4753 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574_95cb6a2f-3fb4-43d6-b342-c17059bf0e73/util/0.log" Oct 05 21:28:41 crc kubenswrapper[4753]: I1005 21:28:41.075667 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574_95cb6a2f-3fb4-43d6-b342-c17059bf0e73/pull/0.log" Oct 05 21:28:41 crc kubenswrapper[4753]: I1005 21:28:41.092802 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574_95cb6a2f-3fb4-43d6-b342-c17059bf0e73/pull/0.log" Oct 05 21:28:41 crc kubenswrapper[4753]: I1005 21:28:41.342629 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574_95cb6a2f-3fb4-43d6-b342-c17059bf0e73/pull/0.log" Oct 05 21:28:41 crc kubenswrapper[4753]: I1005 21:28:41.349364 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574_95cb6a2f-3fb4-43d6-b342-c17059bf0e73/util/0.log" Oct 05 21:28:41 crc kubenswrapper[4753]: I1005 21:28:41.400638 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574_95cb6a2f-3fb4-43d6-b342-c17059bf0e73/extract/0.log" Oct 05 21:28:41 crc kubenswrapper[4753]: I1005 21:28:41.609565 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5b974f6766-wsbjp_12ff014d-81e6-4a9e-8197-e28fbfc4a06e/kube-rbac-proxy/0.log" Oct 05 21:28:41 crc kubenswrapper[4753]: I1005 21:28:41.680941 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-84bd8f6848-p5g48_64896158-a10b-4fd9-b232-5ba3fa647a02/kube-rbac-proxy/0.log" Oct 05 21:28:41 
crc kubenswrapper[4753]: I1005 21:28:41.706788 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5b974f6766-wsbjp_12ff014d-81e6-4a9e-8197-e28fbfc4a06e/manager/0.log" Oct 05 21:28:41 crc kubenswrapper[4753]: I1005 21:28:41.911972 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-4wqxw_9286136d-f0a7-4488-b346-2b3ea3ab81da/kube-rbac-proxy/0.log" Oct 05 21:28:41 crc kubenswrapper[4753]: I1005 21:28:41.935524 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-84bd8f6848-p5g48_64896158-a10b-4fd9-b232-5ba3fa647a02/manager/0.log" Oct 05 21:28:41 crc kubenswrapper[4753]: I1005 21:28:41.970027 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-4wqxw_9286136d-f0a7-4488-b346-2b3ea3ab81da/manager/0.log" Oct 05 21:28:42 crc kubenswrapper[4753]: I1005 21:28:42.157478 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-698456cdc6-tnt6j_d8c88aaa-c54b-4f65-be07-61e23d5a5cd4/kube-rbac-proxy/0.log" Oct 05 21:28:42 crc kubenswrapper[4753]: I1005 21:28:42.241268 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-698456cdc6-tnt6j_d8c88aaa-c54b-4f65-be07-61e23d5a5cd4/manager/0.log" Oct 05 21:28:42 crc kubenswrapper[4753]: I1005 21:28:42.522878 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5c497dbdb-txd4b_5b8831b7-9250-4ec8-b732-2db04e507cfe/manager/0.log" Oct 05 21:28:42 crc kubenswrapper[4753]: I1005 21:28:42.523422 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5c497dbdb-txd4b_5b8831b7-9250-4ec8-b732-2db04e507cfe/kube-rbac-proxy/0.log" Oct 05 21:28:42 crc kubenswrapper[4753]: I1005 21:28:42.628890 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6675647785-dqfcj_885f705b-599d-41fe-92cf-ffd000ad5e6e/kube-rbac-proxy/0.log" Oct 05 21:28:42 crc kubenswrapper[4753]: I1005 21:28:42.763765 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6675647785-dqfcj_885f705b-599d-41fe-92cf-ffd000ad5e6e/manager/0.log" Oct 05 21:28:42 crc kubenswrapper[4753]: I1005 21:28:42.805373 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84788b6bc5-vksxs_f241e98d-8f7c-492a-a4bc-988dc78b6449/kube-rbac-proxy/0.log" Oct 05 21:28:43 crc kubenswrapper[4753]: I1005 21:28:43.017515 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f5894c49f-ct2l6_266b0921-1164-46bc-9e78-986f5ded5943/kube-rbac-proxy/0.log" Oct 05 21:28:43 crc kubenswrapper[4753]: I1005 21:28:43.031714 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84788b6bc5-vksxs_f241e98d-8f7c-492a-a4bc-988dc78b6449/manager/0.log" Oct 05 21:28:43 crc kubenswrapper[4753]: I1005 21:28:43.135519 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f5894c49f-ct2l6_266b0921-1164-46bc-9e78-986f5ded5943/manager/0.log" Oct 05 21:28:43 crc kubenswrapper[4753]: I1005 21:28:43.307386 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-57c9cdcf57-9kjpf_dd3487ac-89f8-40f1-967e-71f7fada0fe1/kube-rbac-proxy/0.log" Oct 05 21:28:43 crc kubenswrapper[4753]: I1005 
21:28:43.307439 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-57c9cdcf57-9kjpf_dd3487ac-89f8-40f1-967e-71f7fada0fe1/manager/0.log" Oct 05 21:28:43 crc kubenswrapper[4753]: I1005 21:28:43.840224 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7cb48dbc-f4dp5_0c5e8f9b-e10e-436b-ae33-07a7350f02a1/kube-rbac-proxy/0.log" Oct 05 21:28:43 crc kubenswrapper[4753]: I1005 21:28:43.880268 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7cb48dbc-f4dp5_0c5e8f9b-e10e-436b-ae33-07a7350f02a1/manager/0.log" Oct 05 21:28:43 crc kubenswrapper[4753]: I1005 21:28:43.982730 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-d6c9dc5bc-rlsgp_00eefbb7-989e-478d-aad3-ff4d236168f2/kube-rbac-proxy/0.log" Oct 05 21:28:44 crc kubenswrapper[4753]: I1005 21:28:44.136950 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-d6c9dc5bc-rlsgp_00eefbb7-989e-478d-aad3-ff4d236168f2/manager/0.log" Oct 05 21:28:44 crc kubenswrapper[4753]: I1005 21:28:44.250669 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-69b956fbf6-vtgfd_9c8b9aa1-e15e-475d-a02e-56b430d50bd1/manager/0.log" Oct 05 21:28:44 crc kubenswrapper[4753]: I1005 21:28:44.283723 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-69b956fbf6-vtgfd_9c8b9aa1-e15e-475d-a02e-56b430d50bd1/kube-rbac-proxy/0.log" Oct 05 21:28:44 crc kubenswrapper[4753]: I1005 21:28:44.495409 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6c9b57c67-cp4qf_3e514d87-9323-4c3b-a372-60e5c65fa731/kube-rbac-proxy/0.log" Oct 05 
21:28:44 crc kubenswrapper[4753]: I1005 21:28:44.596037 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6c9b57c67-cp4qf_3e514d87-9323-4c3b-a372-60e5c65fa731/manager/0.log" Oct 05 21:28:44 crc kubenswrapper[4753]: I1005 21:28:44.676410 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f59f9d8-htstg_d58d3fcd-368b-4d73-8c29-a181f3bdddee/kube-rbac-proxy/0.log" Oct 05 21:28:44 crc kubenswrapper[4753]: I1005 21:28:44.698928 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f59f9d8-htstg_d58d3fcd-368b-4d73-8c29-a181f3bdddee/manager/0.log" Oct 05 21:28:44 crc kubenswrapper[4753]: I1005 21:28:44.823476 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm_28d42154-af7b-440b-af1b-2ef50ee9edca/kube-rbac-proxy/0.log" Oct 05 21:28:44 crc kubenswrapper[4753]: I1005 21:28:44.976387 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cfc658b9-4t94r_d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5/kube-rbac-proxy/0.log" Oct 05 21:28:45 crc kubenswrapper[4753]: I1005 21:28:45.009291 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm_28d42154-af7b-440b-af1b-2ef50ee9edca/manager/0.log" Oct 05 21:28:45 crc kubenswrapper[4753]: I1005 21:28:45.199239 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-677d5bb784-5l45z_22ffe795-4cc5-4c86-9ae6-04999586c7de/kube-rbac-proxy/0.log" Oct 05 21:28:45 crc kubenswrapper[4753]: I1005 21:28:45.409347 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-677d5bb784-5l45z_22ffe795-4cc5-4c86-9ae6-04999586c7de/operator/0.log" Oct 05 21:28:45 crc kubenswrapper[4753]: I1005 21:28:45.500502 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-jswp7_a7f848da-2bdd-4f76-bc85-94cbb95bd680/registry-server/0.log" Oct 05 21:28:45 crc kubenswrapper[4753]: I1005 21:28:45.630205 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-c968bb45-xsjn9_8ed6d37f-576a-4f14-a98a-65193559d7de/kube-rbac-proxy/0.log" Oct 05 21:28:45 crc kubenswrapper[4753]: I1005 21:28:45.792720 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-c968bb45-xsjn9_8ed6d37f-576a-4f14-a98a-65193559d7de/manager/0.log" Oct 05 21:28:45 crc kubenswrapper[4753]: I1005 21:28:45.815867 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-66f6d6849b-9dsbr_ff1a796b-8cc7-4c73-842f-7b4a1170b56f/kube-rbac-proxy/0.log" Oct 05 21:28:46 crc kubenswrapper[4753]: I1005 21:28:46.142050 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cfc658b9-4t94r_d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5/manager/0.log" Oct 05 21:28:46 crc kubenswrapper[4753]: I1005 21:28:46.210385 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-66f6d6849b-9dsbr_ff1a796b-8cc7-4c73-842f-7b4a1170b56f/manager/0.log" Oct 05 21:28:46 crc kubenswrapper[4753]: I1005 21:28:46.339718 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-4zhzq_995eda80-87fa-4160-b04e-679668f8d910/kube-rbac-proxy/0.log" Oct 05 21:28:46 crc kubenswrapper[4753]: I1005 21:28:46.387303 4753 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr_2d0279fb-be4d-47a0-83c7-4452c7b13a5b/operator/0.log" Oct 05 21:28:46 crc kubenswrapper[4753]: I1005 21:28:46.414330 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-4zhzq_995eda80-87fa-4160-b04e-679668f8d910/manager/0.log" Oct 05 21:28:46 crc kubenswrapper[4753]: I1005 21:28:46.568058 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-f589c7597-58qhn_82db7b73-2afb-4063-9d64-fc3fa5559e93/kube-rbac-proxy/0.log" Oct 05 21:28:46 crc kubenswrapper[4753]: I1005 21:28:46.639206 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-f589c7597-58qhn_82db7b73-2afb-4063-9d64-fc3fa5559e93/manager/0.log" Oct 05 21:28:46 crc kubenswrapper[4753]: I1005 21:28:46.663511 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb6dcddc-zqgbl_8dd994e1-cb87-48dc-b844-2bdbc8b6e48d/kube-rbac-proxy/0.log" Oct 05 21:28:46 crc kubenswrapper[4753]: I1005 21:28:46.770421 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb6dcddc-zqgbl_8dd994e1-cb87-48dc-b844-2bdbc8b6e48d/manager/0.log" Oct 05 21:28:46 crc kubenswrapper[4753]: I1005 21:28:46.854758 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5d98cc5575-t99zs_96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3/kube-rbac-proxy/0.log" Oct 05 21:28:46 crc kubenswrapper[4753]: I1005 21:28:46.923100 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5d98cc5575-t99zs_96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3/manager/0.log" Oct 05 21:28:49 crc kubenswrapper[4753]: I1005 
21:28:49.852545 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:28:49 crc kubenswrapper[4753]: E1005 21:28:49.854167 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:29:04 crc kubenswrapper[4753]: I1005 21:29:04.853886 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:29:04 crc kubenswrapper[4753]: E1005 21:29:04.855523 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:29:06 crc kubenswrapper[4753]: I1005 21:29:06.504595 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-h7rzp_7da48090-042e-4fef-afdf-9e6e54a89fe2/control-plane-machine-set-operator/0.log" Oct 05 21:29:07 crc kubenswrapper[4753]: I1005 21:29:07.048358 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-d8b6f_22db54ee-7d52-475e-a824-9e563b2920e8/machine-api-operator/0.log" Oct 05 21:29:07 crc kubenswrapper[4753]: I1005 21:29:07.053767 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-d8b6f_22db54ee-7d52-475e-a824-9e563b2920e8/kube-rbac-proxy/0.log" Oct 05 21:29:19 crc kubenswrapper[4753]: I1005 21:29:19.416253 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-54vmd_b8aa872e-b15b-458f-8bf4-0057a25d5d43/cert-manager-controller/0.log" Oct 05 21:29:19 crc kubenswrapper[4753]: I1005 21:29:19.501835 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qvslf_3749909c-e0a6-4a84-8e16-e8d104f8bb29/cert-manager-cainjector/0.log" Oct 05 21:29:19 crc kubenswrapper[4753]: I1005 21:29:19.613812 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-shkg2_183d1891-ba1d-4ce0-83bd-9a547d099416/cert-manager-webhook/0.log" Oct 05 21:29:19 crc kubenswrapper[4753]: I1005 21:29:19.852552 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:29:19 crc kubenswrapper[4753]: E1005 21:29:19.852799 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:29:32 crc kubenswrapper[4753]: I1005 21:29:32.113708 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-md9qn_51e12003-95ee-4af9-a340-5928bb9d7ae7/nmstate-console-plugin/0.log" Oct 05 21:29:32 crc kubenswrapper[4753]: I1005 21:29:32.328293 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-5mhrc_f5c8b381-2e86-47f5-86da-86db3c2aa511/nmstate-handler/0.log" Oct 05 21:29:32 crc kubenswrapper[4753]: I1005 21:29:32.426892 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-q2rfx_d825ed58-9313-4cf2-a923-53e6d809fa60/nmstate-metrics/0.log" Oct 05 21:29:32 crc kubenswrapper[4753]: I1005 21:29:32.476593 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-q2rfx_d825ed58-9313-4cf2-a923-53e6d809fa60/kube-rbac-proxy/0.log" Oct 05 21:29:32 crc kubenswrapper[4753]: I1005 21:29:32.577677 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-l87q6_ae5b861a-0212-4c4c-944b-7d6f3187a5a8/nmstate-operator/0.log" Oct 05 21:29:32 crc kubenswrapper[4753]: I1005 21:29:32.678665 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-k58rx_1b16b6bd-7d95-49c5-a2a8-87b0018e30c7/nmstate-webhook/0.log" Oct 05 21:29:33 crc kubenswrapper[4753]: I1005 21:29:33.852133 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:29:33 crc kubenswrapper[4753]: E1005 21:29:33.852407 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:29:46 crc kubenswrapper[4753]: I1005 21:29:46.851955 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:29:46 crc kubenswrapper[4753]: E1005 21:29:46.852725 4753 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:29:47 crc kubenswrapper[4753]: I1005 21:29:47.394826 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-bbs9p_e8a9ee2c-3d45-4169-b558-9c27e68cc25f/kube-rbac-proxy/0.log" Oct 05 21:29:47 crc kubenswrapper[4753]: I1005 21:29:47.511335 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-bbs9p_e8a9ee2c-3d45-4169-b558-9c27e68cc25f/controller/0.log" Oct 05 21:29:47 crc kubenswrapper[4753]: I1005 21:29:47.621217 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-frr-files/0.log" Oct 05 21:29:47 crc kubenswrapper[4753]: I1005 21:29:47.779133 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-frr-files/0.log" Oct 05 21:29:47 crc kubenswrapper[4753]: I1005 21:29:47.789280 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-metrics/0.log" Oct 05 21:29:47 crc kubenswrapper[4753]: I1005 21:29:47.815022 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-reloader/0.log" Oct 05 21:29:47 crc kubenswrapper[4753]: I1005 21:29:47.840615 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-reloader/0.log" Oct 05 21:29:48 crc kubenswrapper[4753]: I1005 
21:29:48.033752 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-frr-files/0.log" Oct 05 21:29:48 crc kubenswrapper[4753]: I1005 21:29:48.064182 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-reloader/0.log" Oct 05 21:29:48 crc kubenswrapper[4753]: I1005 21:29:48.107851 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-metrics/0.log" Oct 05 21:29:48 crc kubenswrapper[4753]: I1005 21:29:48.146209 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-metrics/0.log" Oct 05 21:29:48 crc kubenswrapper[4753]: I1005 21:29:48.252898 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-reloader/0.log" Oct 05 21:29:48 crc kubenswrapper[4753]: I1005 21:29:48.254581 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-frr-files/0.log" Oct 05 21:29:48 crc kubenswrapper[4753]: I1005 21:29:48.316745 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-metrics/0.log" Oct 05 21:29:48 crc kubenswrapper[4753]: I1005 21:29:48.341624 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/controller/0.log" Oct 05 21:29:48 crc kubenswrapper[4753]: I1005 21:29:48.454600 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/frr-metrics/0.log" Oct 05 21:29:48 crc kubenswrapper[4753]: I1005 21:29:48.532241 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/kube-rbac-proxy/0.log" Oct 05 21:29:48 crc kubenswrapper[4753]: I1005 21:29:48.656978 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/kube-rbac-proxy-frr/0.log" Oct 05 21:29:48 crc kubenswrapper[4753]: I1005 21:29:48.673025 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/reloader/0.log" Oct 05 21:29:48 crc kubenswrapper[4753]: I1005 21:29:48.952576 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-b4sqj_acdc2b18-6e82-4c5b-964f-b708c56c3704/frr-k8s-webhook-server/0.log" Oct 05 21:29:49 crc kubenswrapper[4753]: I1005 21:29:49.189243 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-78c6d655f5-9pcgt_2849418e-7428-46b6-89b0-fc001cb09db2/webhook-server/0.log" Oct 05 21:29:49 crc kubenswrapper[4753]: I1005 21:29:49.193599 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-76575689f9-tr955_e9003581-3277-433c-9c49-5a186f493cc5/manager/0.log" Oct 05 21:29:49 crc kubenswrapper[4753]: I1005 21:29:49.526159 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z5l8d_bb160ebf-df7f-4e27-b5f5-0e108e377e5d/kube-rbac-proxy/0.log" Oct 05 21:29:49 crc kubenswrapper[4753]: I1005 21:29:49.895910 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z5l8d_bb160ebf-df7f-4e27-b5f5-0e108e377e5d/speaker/0.log" Oct 05 21:29:49 crc kubenswrapper[4753]: I1005 21:29:49.910923 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/frr/0.log" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.190663 4753 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7"] Oct 05 21:30:00 crc kubenswrapper[4753]: E1005 21:30:00.193694 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb2fb36-2d32-4134-b83f-aa85b45d0fb4" containerName="container-00" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.193815 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb2fb36-2d32-4134-b83f-aa85b45d0fb4" containerName="container-00" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.194377 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eb2fb36-2d32-4134-b83f-aa85b45d0fb4" containerName="container-00" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.198775 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.210589 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.210625 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.261810 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7"] Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.363941 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-secret-volume\") pod \"collect-profiles-29328330-4f2k7\" (UID: \"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 
21:30:00.363993 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlw4g\" (UniqueName: \"kubernetes.io/projected/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-kube-api-access-dlw4g\") pod \"collect-profiles-29328330-4f2k7\" (UID: \"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.364632 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-config-volume\") pod \"collect-profiles-29328330-4f2k7\" (UID: \"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.467014 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-config-volume\") pod \"collect-profiles-29328330-4f2k7\" (UID: \"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.467320 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-secret-volume\") pod \"collect-profiles-29328330-4f2k7\" (UID: \"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.467453 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlw4g\" (UniqueName: \"kubernetes.io/projected/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-kube-api-access-dlw4g\") pod \"collect-profiles-29328330-4f2k7\" 
(UID: \"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.468791 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-config-volume\") pod \"collect-profiles-29328330-4f2k7\" (UID: \"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.480745 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-secret-volume\") pod \"collect-profiles-29328330-4f2k7\" (UID: \"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.485417 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlw4g\" (UniqueName: \"kubernetes.io/projected/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-kube-api-access-dlw4g\") pod \"collect-profiles-29328330-4f2k7\" (UID: \"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" Oct 05 21:30:00 crc kubenswrapper[4753]: I1005 21:30:00.525185 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" Oct 05 21:30:01 crc kubenswrapper[4753]: I1005 21:30:01.436594 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7"] Oct 05 21:30:01 crc kubenswrapper[4753]: W1005 21:30:01.439996 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfd487d8_29a5_4c78_a8dc_dabd08b5fe6c.slice/crio-d8bf50deceae62acef0abfa652ca334e571bc8a84cda1b13b28ee0961ef0363a WatchSource:0}: Error finding container d8bf50deceae62acef0abfa652ca334e571bc8a84cda1b13b28ee0961ef0363a: Status 404 returned error can't find the container with id d8bf50deceae62acef0abfa652ca334e571bc8a84cda1b13b28ee0961ef0363a Oct 05 21:30:01 crc kubenswrapper[4753]: I1005 21:30:01.693618 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" event={"ID":"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c","Type":"ContainerStarted","Data":"b48cb1cdc557c37abc27a558ea099f2a108b1efbc7da5617d1029298aa98c0dd"} Oct 05 21:30:01 crc kubenswrapper[4753]: I1005 21:30:01.695472 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" event={"ID":"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c","Type":"ContainerStarted","Data":"d8bf50deceae62acef0abfa652ca334e571bc8a84cda1b13b28ee0961ef0363a"} Oct 05 21:30:01 crc kubenswrapper[4753]: I1005 21:30:01.715635 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" podStartSLOduration=1.714578843 podStartE2EDuration="1.714578843s" podCreationTimestamp="2025-10-05 21:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 
21:30:01.708247068 +0000 UTC m=+4510.556575300" watchObservedRunningTime="2025-10-05 21:30:01.714578843 +0000 UTC m=+4510.562907155" Oct 05 21:30:01 crc kubenswrapper[4753]: I1005 21:30:01.856975 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:30:01 crc kubenswrapper[4753]: E1005 21:30:01.857441 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:30:02 crc kubenswrapper[4753]: I1005 21:30:02.701829 4753 generic.go:334] "Generic (PLEG): container finished" podID="bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c" containerID="b48cb1cdc557c37abc27a558ea099f2a108b1efbc7da5617d1029298aa98c0dd" exitCode=0 Oct 05 21:30:02 crc kubenswrapper[4753]: I1005 21:30:02.701894 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" event={"ID":"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c","Type":"ContainerDied","Data":"b48cb1cdc557c37abc27a558ea099f2a108b1efbc7da5617d1029298aa98c0dd"} Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.051354 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88_e668534c-ad83-4c7d-9270-bffa57782d91/util/0.log" Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.096902 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.244767 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-config-volume\") pod \"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c\" (UID: \"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c\") " Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.244922 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlw4g\" (UniqueName: \"kubernetes.io/projected/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-kube-api-access-dlw4g\") pod \"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c\" (UID: \"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c\") " Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.244971 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-secret-volume\") pod \"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c\" (UID: \"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c\") " Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.245475 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-config-volume" (OuterVolumeSpecName: "config-volume") pod "bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c" (UID: "bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.245803 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.264477 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c" (UID: "bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.283380 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-kube-api-access-dlw4g" (OuterVolumeSpecName: "kube-api-access-dlw4g") pod "bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c" (UID: "bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c"). InnerVolumeSpecName "kube-api-access-dlw4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.336196 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88_e668534c-ad83-4c7d-9270-bffa57782d91/util/0.log" Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.347997 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlw4g\" (UniqueName: \"kubernetes.io/projected/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-kube-api-access-dlw4g\") on node \"crc\" DevicePath \"\"" Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.348188 4753 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.415516 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88_e668534c-ad83-4c7d-9270-bffa57782d91/pull/0.log" Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.443259 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88_e668534c-ad83-4c7d-9270-bffa57782d91/pull/0.log" Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.501761 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46"] Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.509171 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328285-56g46"] Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.717322 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" 
event={"ID":"bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c","Type":"ContainerDied","Data":"d8bf50deceae62acef0abfa652ca334e571bc8a84cda1b13b28ee0961ef0363a"} Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.717357 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8bf50deceae62acef0abfa652ca334e571bc8a84cda1b13b28ee0961ef0363a" Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.717397 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328330-4f2k7" Oct 05 21:30:04 crc kubenswrapper[4753]: I1005 21:30:04.985708 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88_e668534c-ad83-4c7d-9270-bffa57782d91/util/0.log" Oct 05 21:30:05 crc kubenswrapper[4753]: I1005 21:30:05.113857 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88_e668534c-ad83-4c7d-9270-bffa57782d91/pull/0.log" Oct 05 21:30:05 crc kubenswrapper[4753]: I1005 21:30:05.187414 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88_e668534c-ad83-4c7d-9270-bffa57782d91/extract/0.log" Oct 05 21:30:05 crc kubenswrapper[4753]: I1005 21:30:05.244043 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gdpm6_a4bd315d-8dd0-4844-a675-ddc48827669d/extract-utilities/0.log" Oct 05 21:30:05 crc kubenswrapper[4753]: I1005 21:30:05.391581 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gdpm6_a4bd315d-8dd0-4844-a675-ddc48827669d/extract-content/0.log" Oct 05 21:30:05 crc kubenswrapper[4753]: I1005 21:30:05.419328 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-gdpm6_a4bd315d-8dd0-4844-a675-ddc48827669d/extract-content/0.log" Oct 05 21:30:05 crc kubenswrapper[4753]: I1005 21:30:05.476608 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gdpm6_a4bd315d-8dd0-4844-a675-ddc48827669d/extract-utilities/0.log" Oct 05 21:30:05 crc kubenswrapper[4753]: I1005 21:30:05.603401 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gdpm6_a4bd315d-8dd0-4844-a675-ddc48827669d/extract-content/0.log" Oct 05 21:30:05 crc kubenswrapper[4753]: I1005 21:30:05.668939 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gdpm6_a4bd315d-8dd0-4844-a675-ddc48827669d/extract-utilities/0.log" Oct 05 21:30:05 crc kubenswrapper[4753]: I1005 21:30:05.863583 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f0158d5-f700-4d7c-a6e9-54f55bfc830c" path="/var/lib/kubelet/pods/2f0158d5-f700-4d7c-a6e9-54f55bfc830c/volumes" Oct 05 21:30:05 crc kubenswrapper[4753]: I1005 21:30:05.906175 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jq98s_09240459-8b63-4037-b2d7-0f3a2e294835/extract-utilities/0.log" Oct 05 21:30:06 crc kubenswrapper[4753]: I1005 21:30:06.095177 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gdpm6_a4bd315d-8dd0-4844-a675-ddc48827669d/registry-server/0.log" Oct 05 21:30:06 crc kubenswrapper[4753]: I1005 21:30:06.179058 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jq98s_09240459-8b63-4037-b2d7-0f3a2e294835/extract-utilities/0.log" Oct 05 21:30:06 crc kubenswrapper[4753]: I1005 21:30:06.190991 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jq98s_09240459-8b63-4037-b2d7-0f3a2e294835/extract-content/0.log" Oct 05 21:30:06 crc kubenswrapper[4753]: I1005 21:30:06.209691 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jq98s_09240459-8b63-4037-b2d7-0f3a2e294835/extract-content/0.log" Oct 05 21:30:06 crc kubenswrapper[4753]: I1005 21:30:06.340315 4753 scope.go:117] "RemoveContainer" containerID="48acf24bf446a951dd685758b111a6f4ae68ab851aea2e81b075aebb20a3ea6e" Oct 05 21:30:06 crc kubenswrapper[4753]: I1005 21:30:06.419540 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jq98s_09240459-8b63-4037-b2d7-0f3a2e294835/extract-utilities/0.log" Oct 05 21:30:06 crc kubenswrapper[4753]: I1005 21:30:06.465356 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jq98s_09240459-8b63-4037-b2d7-0f3a2e294835/extract-content/0.log" Oct 05 21:30:06 crc kubenswrapper[4753]: I1005 21:30:06.959102 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jq98s_09240459-8b63-4037-b2d7-0f3a2e294835/registry-server/0.log" Oct 05 21:30:07 crc kubenswrapper[4753]: I1005 21:30:07.189093 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892_2c210d16-5774-42c1-95d1-bba3c106fa44/util/0.log" Oct 05 21:30:07 crc kubenswrapper[4753]: I1005 21:30:07.269531 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892_2c210d16-5774-42c1-95d1-bba3c106fa44/util/0.log" Oct 05 21:30:07 crc kubenswrapper[4753]: I1005 21:30:07.320525 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892_2c210d16-5774-42c1-95d1-bba3c106fa44/pull/0.log" Oct 05 21:30:07 crc kubenswrapper[4753]: I1005 21:30:07.323861 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892_2c210d16-5774-42c1-95d1-bba3c106fa44/pull/0.log" Oct 05 21:30:07 crc kubenswrapper[4753]: I1005 21:30:07.533490 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892_2c210d16-5774-42c1-95d1-bba3c106fa44/util/0.log" Oct 05 21:30:07 crc kubenswrapper[4753]: I1005 21:30:07.543237 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892_2c210d16-5774-42c1-95d1-bba3c106fa44/pull/0.log" Oct 05 21:30:07 crc kubenswrapper[4753]: I1005 21:30:07.587930 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892_2c210d16-5774-42c1-95d1-bba3c106fa44/extract/0.log" Oct 05 21:30:07 crc kubenswrapper[4753]: I1005 21:30:07.629584 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-98zw7_1c9573cf-5cd6-4c3b-8c62-0766f942629a/marketplace-operator/0.log" Oct 05 21:30:07 crc kubenswrapper[4753]: I1005 21:30:07.759283 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qnn2p_6b25c928-06b3-4bea-91cd-f1c609d1e785/extract-utilities/0.log" Oct 05 21:30:07 crc kubenswrapper[4753]: I1005 21:30:07.942883 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qnn2p_6b25c928-06b3-4bea-91cd-f1c609d1e785/extract-utilities/0.log" Oct 05 21:30:07 crc kubenswrapper[4753]: I1005 21:30:07.956022 4753 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qnn2p_6b25c928-06b3-4bea-91cd-f1c609d1e785/extract-content/0.log" Oct 05 21:30:07 crc kubenswrapper[4753]: I1005 21:30:07.971231 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qnn2p_6b25c928-06b3-4bea-91cd-f1c609d1e785/extract-content/0.log" Oct 05 21:30:08 crc kubenswrapper[4753]: I1005 21:30:08.120678 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qnn2p_6b25c928-06b3-4bea-91cd-f1c609d1e785/extract-content/0.log" Oct 05 21:30:08 crc kubenswrapper[4753]: I1005 21:30:08.149442 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qnn2p_6b25c928-06b3-4bea-91cd-f1c609d1e785/extract-utilities/0.log" Oct 05 21:30:08 crc kubenswrapper[4753]: I1005 21:30:08.179433 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nhg75_7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8/extract-utilities/0.log" Oct 05 21:30:08 crc kubenswrapper[4753]: I1005 21:30:08.286631 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qnn2p_6b25c928-06b3-4bea-91cd-f1c609d1e785/registry-server/0.log" Oct 05 21:30:08 crc kubenswrapper[4753]: I1005 21:30:08.414095 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nhg75_7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8/extract-content/0.log" Oct 05 21:30:08 crc kubenswrapper[4753]: I1005 21:30:08.436264 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nhg75_7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8/extract-content/0.log" Oct 05 21:30:08 crc kubenswrapper[4753]: I1005 21:30:08.438283 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-nhg75_7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8/extract-utilities/0.log" Oct 05 21:30:08 crc kubenswrapper[4753]: I1005 21:30:08.575075 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nhg75_7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8/extract-utilities/0.log" Oct 05 21:30:08 crc kubenswrapper[4753]: I1005 21:30:08.582019 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nhg75_7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8/extract-content/0.log" Oct 05 21:30:08 crc kubenswrapper[4753]: I1005 21:30:08.957735 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nhg75_7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8/registry-server/0.log" Oct 05 21:30:15 crc kubenswrapper[4753]: I1005 21:30:15.852273 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:30:15 crc kubenswrapper[4753]: E1005 21:30:15.854161 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:30:30 crc kubenswrapper[4753]: I1005 21:30:30.852391 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:30:30 crc kubenswrapper[4753]: E1005 21:30:30.854130 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:30:41 crc kubenswrapper[4753]: I1005 21:30:41.863981 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:30:41 crc kubenswrapper[4753]: E1005 21:30:41.864802 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:30:56 crc kubenswrapper[4753]: I1005 21:30:56.852394 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:30:56 crc kubenswrapper[4753]: E1005 21:30:56.853203 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:31:08 crc kubenswrapper[4753]: I1005 21:31:08.854656 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:31:08 crc kubenswrapper[4753]: E1005 21:31:08.857024 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:31:19 crc kubenswrapper[4753]: I1005 21:31:19.851884 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:31:19 crc kubenswrapper[4753]: E1005 21:31:19.852695 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:31:24 crc kubenswrapper[4753]: I1005 21:31:24.951345 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w7kcf"] Oct 05 21:31:24 crc kubenswrapper[4753]: E1005 21:31:24.952870 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c" containerName="collect-profiles" Oct 05 21:31:24 crc kubenswrapper[4753]: I1005 21:31:24.952893 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c" containerName="collect-profiles" Oct 05 21:31:24 crc kubenswrapper[4753]: I1005 21:31:24.953256 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd487d8-29a5-4c78-a8dc-dabd08b5fe6c" containerName="collect-profiles" Oct 05 21:31:24 crc kubenswrapper[4753]: I1005 21:31:24.956285 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w7kcf" Oct 05 21:31:24 crc kubenswrapper[4753]: I1005 21:31:24.971004 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w7kcf"] Oct 05 21:31:25 crc kubenswrapper[4753]: I1005 21:31:25.087157 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f52e130-3b76-477f-b8bb-d8ca3e06a653-utilities\") pod \"community-operators-w7kcf\" (UID: \"2f52e130-3b76-477f-b8bb-d8ca3e06a653\") " pod="openshift-marketplace/community-operators-w7kcf" Oct 05 21:31:25 crc kubenswrapper[4753]: I1005 21:31:25.087248 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f52e130-3b76-477f-b8bb-d8ca3e06a653-catalog-content\") pod \"community-operators-w7kcf\" (UID: \"2f52e130-3b76-477f-b8bb-d8ca3e06a653\") " pod="openshift-marketplace/community-operators-w7kcf" Oct 05 21:31:25 crc kubenswrapper[4753]: I1005 21:31:25.087419 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq2ck\" (UniqueName: \"kubernetes.io/projected/2f52e130-3b76-477f-b8bb-d8ca3e06a653-kube-api-access-kq2ck\") pod \"community-operators-w7kcf\" (UID: \"2f52e130-3b76-477f-b8bb-d8ca3e06a653\") " pod="openshift-marketplace/community-operators-w7kcf" Oct 05 21:31:25 crc kubenswrapper[4753]: I1005 21:31:25.190488 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f52e130-3b76-477f-b8bb-d8ca3e06a653-utilities\") pod \"community-operators-w7kcf\" (UID: \"2f52e130-3b76-477f-b8bb-d8ca3e06a653\") " pod="openshift-marketplace/community-operators-w7kcf" Oct 05 21:31:25 crc kubenswrapper[4753]: I1005 21:31:25.190614 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f52e130-3b76-477f-b8bb-d8ca3e06a653-catalog-content\") pod \"community-operators-w7kcf\" (UID: \"2f52e130-3b76-477f-b8bb-d8ca3e06a653\") " pod="openshift-marketplace/community-operators-w7kcf" Oct 05 21:31:25 crc kubenswrapper[4753]: I1005 21:31:25.190741 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq2ck\" (UniqueName: \"kubernetes.io/projected/2f52e130-3b76-477f-b8bb-d8ca3e06a653-kube-api-access-kq2ck\") pod \"community-operators-w7kcf\" (UID: \"2f52e130-3b76-477f-b8bb-d8ca3e06a653\") " pod="openshift-marketplace/community-operators-w7kcf" Oct 05 21:31:25 crc kubenswrapper[4753]: I1005 21:31:25.191653 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f52e130-3b76-477f-b8bb-d8ca3e06a653-catalog-content\") pod \"community-operators-w7kcf\" (UID: \"2f52e130-3b76-477f-b8bb-d8ca3e06a653\") " pod="openshift-marketplace/community-operators-w7kcf" Oct 05 21:31:25 crc kubenswrapper[4753]: I1005 21:31:25.191676 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f52e130-3b76-477f-b8bb-d8ca3e06a653-utilities\") pod \"community-operators-w7kcf\" (UID: \"2f52e130-3b76-477f-b8bb-d8ca3e06a653\") " pod="openshift-marketplace/community-operators-w7kcf" Oct 05 21:31:25 crc kubenswrapper[4753]: I1005 21:31:25.224680 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq2ck\" (UniqueName: \"kubernetes.io/projected/2f52e130-3b76-477f-b8bb-d8ca3e06a653-kube-api-access-kq2ck\") pod \"community-operators-w7kcf\" (UID: \"2f52e130-3b76-477f-b8bb-d8ca3e06a653\") " pod="openshift-marketplace/community-operators-w7kcf" Oct 05 21:31:25 crc kubenswrapper[4753]: I1005 21:31:25.281403 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w7kcf" Oct 05 21:31:25 crc kubenswrapper[4753]: I1005 21:31:25.826498 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w7kcf"] Oct 05 21:31:26 crc kubenswrapper[4753]: I1005 21:31:26.514437 4753 generic.go:334] "Generic (PLEG): container finished" podID="2f52e130-3b76-477f-b8bb-d8ca3e06a653" containerID="5c6772cd7ae4caf6ed14aaa70b75bdf5738ac34e47c169412e6daea5d7154a8b" exitCode=0 Oct 05 21:31:26 crc kubenswrapper[4753]: I1005 21:31:26.514619 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7kcf" event={"ID":"2f52e130-3b76-477f-b8bb-d8ca3e06a653","Type":"ContainerDied","Data":"5c6772cd7ae4caf6ed14aaa70b75bdf5738ac34e47c169412e6daea5d7154a8b"} Oct 05 21:31:26 crc kubenswrapper[4753]: I1005 21:31:26.514731 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7kcf" event={"ID":"2f52e130-3b76-477f-b8bb-d8ca3e06a653","Type":"ContainerStarted","Data":"b569220caa448eca780aca86ea7060bb053343f6c0c0d7a87831d2423c614f32"} Oct 05 21:31:26 crc kubenswrapper[4753]: I1005 21:31:26.519717 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 05 21:31:27 crc kubenswrapper[4753]: I1005 21:31:27.528893 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7kcf" event={"ID":"2f52e130-3b76-477f-b8bb-d8ca3e06a653","Type":"ContainerStarted","Data":"c3e99978595de089e8579b22ff2925b64e899bf0f572e8aace838429020e30e2"} Oct 05 21:31:28 crc kubenswrapper[4753]: I1005 21:31:28.542863 4753 generic.go:334] "Generic (PLEG): container finished" podID="2f52e130-3b76-477f-b8bb-d8ca3e06a653" containerID="c3e99978595de089e8579b22ff2925b64e899bf0f572e8aace838429020e30e2" exitCode=0 Oct 05 21:31:28 crc kubenswrapper[4753]: I1005 21:31:28.542957 4753 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-w7kcf" event={"ID":"2f52e130-3b76-477f-b8bb-d8ca3e06a653","Type":"ContainerDied","Data":"c3e99978595de089e8579b22ff2925b64e899bf0f572e8aace838429020e30e2"} Oct 05 21:31:29 crc kubenswrapper[4753]: I1005 21:31:29.553364 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7kcf" event={"ID":"2f52e130-3b76-477f-b8bb-d8ca3e06a653","Type":"ContainerStarted","Data":"3a98df518250e341d42be5cb1b7695d8fa3f87a19c8788c45c2b442f58b0d173"} Oct 05 21:31:29 crc kubenswrapper[4753]: I1005 21:31:29.573442 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w7kcf" podStartSLOduration=3.061892828 podStartE2EDuration="5.57342417s" podCreationTimestamp="2025-10-05 21:31:24 +0000 UTC" firstStartedPulling="2025-10-05 21:31:26.517351376 +0000 UTC m=+4595.365679618" lastFinishedPulling="2025-10-05 21:31:29.028882718 +0000 UTC m=+4597.877210960" observedRunningTime="2025-10-05 21:31:29.570931542 +0000 UTC m=+4598.419259774" watchObservedRunningTime="2025-10-05 21:31:29.57342417 +0000 UTC m=+4598.421752402" Oct 05 21:31:33 crc kubenswrapper[4753]: I1005 21:31:33.852593 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:31:33 crc kubenswrapper[4753]: E1005 21:31:33.853683 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:31:35 crc kubenswrapper[4753]: I1005 21:31:35.282483 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-w7kcf"
Oct 05 21:31:35 crc kubenswrapper[4753]: I1005 21:31:35.282552 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w7kcf"
Oct 05 21:31:35 crc kubenswrapper[4753]: I1005 21:31:35.355354 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w7kcf"
Oct 05 21:31:35 crc kubenswrapper[4753]: I1005 21:31:35.699220 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w7kcf"
Oct 05 21:31:35 crc kubenswrapper[4753]: I1005 21:31:35.777851 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w7kcf"]
Oct 05 21:31:37 crc kubenswrapper[4753]: I1005 21:31:37.640540 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w7kcf" podUID="2f52e130-3b76-477f-b8bb-d8ca3e06a653" containerName="registry-server" containerID="cri-o://3a98df518250e341d42be5cb1b7695d8fa3f87a19c8788c45c2b442f58b0d173" gracePeriod=2
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.126542 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w7kcf"
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.266910 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq2ck\" (UniqueName: \"kubernetes.io/projected/2f52e130-3b76-477f-b8bb-d8ca3e06a653-kube-api-access-kq2ck\") pod \"2f52e130-3b76-477f-b8bb-d8ca3e06a653\" (UID: \"2f52e130-3b76-477f-b8bb-d8ca3e06a653\") "
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.266989 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f52e130-3b76-477f-b8bb-d8ca3e06a653-catalog-content\") pod \"2f52e130-3b76-477f-b8bb-d8ca3e06a653\" (UID: \"2f52e130-3b76-477f-b8bb-d8ca3e06a653\") "
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.267094 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f52e130-3b76-477f-b8bb-d8ca3e06a653-utilities\") pod \"2f52e130-3b76-477f-b8bb-d8ca3e06a653\" (UID: \"2f52e130-3b76-477f-b8bb-d8ca3e06a653\") "
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.268210 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f52e130-3b76-477f-b8bb-d8ca3e06a653-utilities" (OuterVolumeSpecName: "utilities") pod "2f52e130-3b76-477f-b8bb-d8ca3e06a653" (UID: "2f52e130-3b76-477f-b8bb-d8ca3e06a653"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.280985 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f52e130-3b76-477f-b8bb-d8ca3e06a653-kube-api-access-kq2ck" (OuterVolumeSpecName: "kube-api-access-kq2ck") pod "2f52e130-3b76-477f-b8bb-d8ca3e06a653" (UID: "2f52e130-3b76-477f-b8bb-d8ca3e06a653"). InnerVolumeSpecName "kube-api-access-kq2ck". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.312548 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f52e130-3b76-477f-b8bb-d8ca3e06a653-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f52e130-3b76-477f-b8bb-d8ca3e06a653" (UID: "2f52e130-3b76-477f-b8bb-d8ca3e06a653"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.371983 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq2ck\" (UniqueName: \"kubernetes.io/projected/2f52e130-3b76-477f-b8bb-d8ca3e06a653-kube-api-access-kq2ck\") on node \"crc\" DevicePath \"\""
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.372042 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f52e130-3b76-477f-b8bb-d8ca3e06a653-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.372062 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f52e130-3b76-477f-b8bb-d8ca3e06a653-utilities\") on node \"crc\" DevicePath \"\""
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.656539 4753 generic.go:334] "Generic (PLEG): container finished" podID="2f52e130-3b76-477f-b8bb-d8ca3e06a653" containerID="3a98df518250e341d42be5cb1b7695d8fa3f87a19c8788c45c2b442f58b0d173" exitCode=0
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.656576 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7kcf" event={"ID":"2f52e130-3b76-477f-b8bb-d8ca3e06a653","Type":"ContainerDied","Data":"3a98df518250e341d42be5cb1b7695d8fa3f87a19c8788c45c2b442f58b0d173"}
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.656599 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7kcf" event={"ID":"2f52e130-3b76-477f-b8bb-d8ca3e06a653","Type":"ContainerDied","Data":"b569220caa448eca780aca86ea7060bb053343f6c0c0d7a87831d2423c614f32"}
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.656615 4753 scope.go:117] "RemoveContainer" containerID="3a98df518250e341d42be5cb1b7695d8fa3f87a19c8788c45c2b442f58b0d173"
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.656662 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w7kcf"
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.708702 4753 scope.go:117] "RemoveContainer" containerID="c3e99978595de089e8579b22ff2925b64e899bf0f572e8aace838429020e30e2"
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.740733 4753 scope.go:117] "RemoveContainer" containerID="5c6772cd7ae4caf6ed14aaa70b75bdf5738ac34e47c169412e6daea5d7154a8b"
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.746334 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w7kcf"]
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.755016 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w7kcf"]
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.791573 4753 scope.go:117] "RemoveContainer" containerID="3a98df518250e341d42be5cb1b7695d8fa3f87a19c8788c45c2b442f58b0d173"
Oct 05 21:31:38 crc kubenswrapper[4753]: E1005 21:31:38.792120 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a98df518250e341d42be5cb1b7695d8fa3f87a19c8788c45c2b442f58b0d173\": container with ID starting with 3a98df518250e341d42be5cb1b7695d8fa3f87a19c8788c45c2b442f58b0d173 not found: ID does not exist" containerID="3a98df518250e341d42be5cb1b7695d8fa3f87a19c8788c45c2b442f58b0d173"
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.792270 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a98df518250e341d42be5cb1b7695d8fa3f87a19c8788c45c2b442f58b0d173"} err="failed to get container status \"3a98df518250e341d42be5cb1b7695d8fa3f87a19c8788c45c2b442f58b0d173\": rpc error: code = NotFound desc = could not find container \"3a98df518250e341d42be5cb1b7695d8fa3f87a19c8788c45c2b442f58b0d173\": container with ID starting with 3a98df518250e341d42be5cb1b7695d8fa3f87a19c8788c45c2b442f58b0d173 not found: ID does not exist"
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.792308 4753 scope.go:117] "RemoveContainer" containerID="c3e99978595de089e8579b22ff2925b64e899bf0f572e8aace838429020e30e2"
Oct 05 21:31:38 crc kubenswrapper[4753]: E1005 21:31:38.793824 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3e99978595de089e8579b22ff2925b64e899bf0f572e8aace838429020e30e2\": container with ID starting with c3e99978595de089e8579b22ff2925b64e899bf0f572e8aace838429020e30e2 not found: ID does not exist" containerID="c3e99978595de089e8579b22ff2925b64e899bf0f572e8aace838429020e30e2"
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.794002 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3e99978595de089e8579b22ff2925b64e899bf0f572e8aace838429020e30e2"} err="failed to get container status \"c3e99978595de089e8579b22ff2925b64e899bf0f572e8aace838429020e30e2\": rpc error: code = NotFound desc = could not find container \"c3e99978595de089e8579b22ff2925b64e899bf0f572e8aace838429020e30e2\": container with ID starting with c3e99978595de089e8579b22ff2925b64e899bf0f572e8aace838429020e30e2 not found: ID does not exist"
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.794184 4753 scope.go:117] "RemoveContainer" containerID="5c6772cd7ae4caf6ed14aaa70b75bdf5738ac34e47c169412e6daea5d7154a8b"
Oct 05 21:31:38 crc kubenswrapper[4753]: E1005 21:31:38.794789 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c6772cd7ae4caf6ed14aaa70b75bdf5738ac34e47c169412e6daea5d7154a8b\": container with ID starting with 5c6772cd7ae4caf6ed14aaa70b75bdf5738ac34e47c169412e6daea5d7154a8b not found: ID does not exist" containerID="5c6772cd7ae4caf6ed14aaa70b75bdf5738ac34e47c169412e6daea5d7154a8b"
Oct 05 21:31:38 crc kubenswrapper[4753]: I1005 21:31:38.794839 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6772cd7ae4caf6ed14aaa70b75bdf5738ac34e47c169412e6daea5d7154a8b"} err="failed to get container status \"5c6772cd7ae4caf6ed14aaa70b75bdf5738ac34e47c169412e6daea5d7154a8b\": rpc error: code = NotFound desc = could not find container \"5c6772cd7ae4caf6ed14aaa70b75bdf5738ac34e47c169412e6daea5d7154a8b\": container with ID starting with 5c6772cd7ae4caf6ed14aaa70b75bdf5738ac34e47c169412e6daea5d7154a8b not found: ID does not exist"
Oct 05 21:31:39 crc kubenswrapper[4753]: I1005 21:31:39.873016 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f52e130-3b76-477f-b8bb-d8ca3e06a653" path="/var/lib/kubelet/pods/2f52e130-3b76-477f-b8bb-d8ca3e06a653/volumes"
Oct 05 21:31:45 crc kubenswrapper[4753]: I1005 21:31:45.851983 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691"
Oct 05 21:31:45 crc kubenswrapper[4753]: E1005 21:31:45.853222 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37"
Oct 05 21:31:56 crc kubenswrapper[4753]: I1005 21:31:56.876432 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691"
Oct 05 21:31:56 crc kubenswrapper[4753]: E1005 21:31:56.877292 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.418383 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9gln9"]
Oct 05 21:32:02 crc kubenswrapper[4753]: E1005 21:32:02.419425 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f52e130-3b76-477f-b8bb-d8ca3e06a653" containerName="extract-content"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.419439 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f52e130-3b76-477f-b8bb-d8ca3e06a653" containerName="extract-content"
Oct 05 21:32:02 crc kubenswrapper[4753]: E1005 21:32:02.419472 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f52e130-3b76-477f-b8bb-d8ca3e06a653" containerName="registry-server"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.419480 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f52e130-3b76-477f-b8bb-d8ca3e06a653" containerName="registry-server"
Oct 05 21:32:02 crc kubenswrapper[4753]: E1005 21:32:02.419522 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f52e130-3b76-477f-b8bb-d8ca3e06a653" containerName="extract-utilities"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.419530 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f52e130-3b76-477f-b8bb-d8ca3e06a653" containerName="extract-utilities"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.419761 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f52e130-3b76-477f-b8bb-d8ca3e06a653" containerName="registry-server"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.428426 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.438733 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9gln9"]
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.465469 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-utilities\") pod \"redhat-operators-9gln9\" (UID: \"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206\") " pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.465544 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxw7x\" (UniqueName: \"kubernetes.io/projected/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-kube-api-access-zxw7x\") pod \"redhat-operators-9gln9\" (UID: \"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206\") " pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.465582 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-catalog-content\") pod \"redhat-operators-9gln9\" (UID: \"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206\") " pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.567566 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-utilities\") pod \"redhat-operators-9gln9\" (UID: \"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206\") " pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.568077 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxw7x\" (UniqueName: \"kubernetes.io/projected/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-kube-api-access-zxw7x\") pod \"redhat-operators-9gln9\" (UID: \"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206\") " pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.567983 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-utilities\") pod \"redhat-operators-9gln9\" (UID: \"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206\") " pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.568249 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-catalog-content\") pod \"redhat-operators-9gln9\" (UID: \"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206\") " pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.568509 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-catalog-content\") pod \"redhat-operators-9gln9\" (UID: \"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206\") " pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.585783 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxw7x\" (UniqueName: \"kubernetes.io/projected/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-kube-api-access-zxw7x\") pod \"redhat-operators-9gln9\" (UID: \"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206\") " pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:02 crc kubenswrapper[4753]: I1005 21:32:02.749171 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:03 crc kubenswrapper[4753]: I1005 21:32:03.225213 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9gln9"]
Oct 05 21:32:03 crc kubenswrapper[4753]: I1005 21:32:03.979792 4753 generic.go:334] "Generic (PLEG): container finished" podID="c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" containerID="940601bd9c452241d513d3bc261a4f47a95f736501b455acf5fb302cddac0888" exitCode=0
Oct 05 21:32:03 crc kubenswrapper[4753]: I1005 21:32:03.980050 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gln9" event={"ID":"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206","Type":"ContainerDied","Data":"940601bd9c452241d513d3bc261a4f47a95f736501b455acf5fb302cddac0888"}
Oct 05 21:32:03 crc kubenswrapper[4753]: I1005 21:32:03.980172 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gln9" event={"ID":"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206","Type":"ContainerStarted","Data":"0668e4c5a3a021fd128e2433a9be087e0b2b3a54919135ec8487af9dfeb77fec"}
Oct 05 21:32:04 crc kubenswrapper[4753]: I1005 21:32:04.990913 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gln9" event={"ID":"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206","Type":"ContainerStarted","Data":"24879060c23a49ffb1d640d47477f6150d5c1edab182196babd4f6bcffd17eae"}
Oct 05 21:32:08 crc kubenswrapper[4753]: I1005 21:32:08.021627 4753 generic.go:334] "Generic (PLEG): container finished" podID="c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" containerID="24879060c23a49ffb1d640d47477f6150d5c1edab182196babd4f6bcffd17eae" exitCode=0
Oct 05 21:32:08 crc kubenswrapper[4753]: I1005 21:32:08.021706 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gln9" event={"ID":"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206","Type":"ContainerDied","Data":"24879060c23a49ffb1d640d47477f6150d5c1edab182196babd4f6bcffd17eae"}
Oct 05 21:32:09 crc kubenswrapper[4753]: I1005 21:32:09.031443 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gln9" event={"ID":"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206","Type":"ContainerStarted","Data":"5b62f2d6804009c30228f28acb24d093021c2f64bf894087e71f377ce358ca9e"}
Oct 05 21:32:09 crc kubenswrapper[4753]: I1005 21:32:09.856084 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691"
Oct 05 21:32:09 crc kubenswrapper[4753]: E1005 21:32:09.856608 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37"
Oct 05 21:32:12 crc kubenswrapper[4753]: I1005 21:32:12.750438 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:12 crc kubenswrapper[4753]: I1005 21:32:12.752428 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:13 crc kubenswrapper[4753]: I1005 21:32:13.797022 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9gln9" podUID="c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" containerName="registry-server" probeResult="failure" output=<
Oct 05 21:32:13 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s
Oct 05 21:32:13 crc kubenswrapper[4753]: >
Oct 05 21:32:20 crc kubenswrapper[4753]: I1005 21:32:20.852095 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691"
Oct 05 21:32:20 crc kubenswrapper[4753]: E1005 21:32:20.852965 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37"
Oct 05 21:32:22 crc kubenswrapper[4753]: I1005 21:32:22.837317 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:22 crc kubenswrapper[4753]: I1005 21:32:22.868370 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9gln9" podStartSLOduration=16.141188744 podStartE2EDuration="20.868349807s" podCreationTimestamp="2025-10-05 21:32:02 +0000 UTC" firstStartedPulling="2025-10-05 21:32:03.982794479 +0000 UTC m=+4632.831122741" lastFinishedPulling="2025-10-05 21:32:08.709955572 +0000 UTC m=+4637.558283804" observedRunningTime="2025-10-05 21:32:09.051200562 +0000 UTC m=+4637.899528794" watchObservedRunningTime="2025-10-05 21:32:22.868349807 +0000 UTC m=+4651.716678049"
Oct 05 21:32:22 crc kubenswrapper[4753]: I1005 21:32:22.936231 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:23 crc kubenswrapper[4753]: I1005 21:32:23.096159 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9gln9"]
Oct 05 21:32:24 crc kubenswrapper[4753]: I1005 21:32:24.182981 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9gln9" podUID="c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" containerName="registry-server" containerID="cri-o://5b62f2d6804009c30228f28acb24d093021c2f64bf894087e71f377ce358ca9e" gracePeriod=2
Oct 05 21:32:24 crc kubenswrapper[4753]: I1005 21:32:24.647227 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:24 crc kubenswrapper[4753]: I1005 21:32:24.769287 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-catalog-content\") pod \"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206\" (UID: \"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206\") "
Oct 05 21:32:24 crc kubenswrapper[4753]: I1005 21:32:24.769500 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxw7x\" (UniqueName: \"kubernetes.io/projected/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-kube-api-access-zxw7x\") pod \"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206\" (UID: \"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206\") "
Oct 05 21:32:24 crc kubenswrapper[4753]: I1005 21:32:24.769586 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-utilities\") pod \"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206\" (UID: \"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206\") "
Oct 05 21:32:24 crc kubenswrapper[4753]: I1005 21:32:24.772227 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-utilities" (OuterVolumeSpecName: "utilities") pod "c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" (UID: "c0aeb2d2-9f71-4665-99ba-4a02e4fc5206"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 05 21:32:24 crc kubenswrapper[4753]: I1005 21:32:24.789519 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-kube-api-access-zxw7x" (OuterVolumeSpecName: "kube-api-access-zxw7x") pod "c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" (UID: "c0aeb2d2-9f71-4665-99ba-4a02e4fc5206"). InnerVolumeSpecName "kube-api-access-zxw7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 21:32:24 crc kubenswrapper[4753]: I1005 21:32:24.857800 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" (UID: "c0aeb2d2-9f71-4665-99ba-4a02e4fc5206"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 05 21:32:24 crc kubenswrapper[4753]: I1005 21:32:24.873066 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 05 21:32:24 crc kubenswrapper[4753]: I1005 21:32:24.873108 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxw7x\" (UniqueName: \"kubernetes.io/projected/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-kube-api-access-zxw7x\") on node \"crc\" DevicePath \"\""
Oct 05 21:32:24 crc kubenswrapper[4753]: I1005 21:32:24.873125 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206-utilities\") on node \"crc\" DevicePath \"\""
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.196975 4753 generic.go:334] "Generic (PLEG): container finished" podID="c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" containerID="5b62f2d6804009c30228f28acb24d093021c2f64bf894087e71f377ce358ca9e" exitCode=0
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.197032 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gln9" event={"ID":"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206","Type":"ContainerDied","Data":"5b62f2d6804009c30228f28acb24d093021c2f64bf894087e71f377ce358ca9e"}
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.197421 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gln9" event={"ID":"c0aeb2d2-9f71-4665-99ba-4a02e4fc5206","Type":"ContainerDied","Data":"0668e4c5a3a021fd128e2433a9be087e0b2b3a54919135ec8487af9dfeb77fec"}
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.197447 4753 scope.go:117] "RemoveContainer" containerID="5b62f2d6804009c30228f28acb24d093021c2f64bf894087e71f377ce358ca9e"
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.197088 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9gln9"
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.232553 4753 scope.go:117] "RemoveContainer" containerID="24879060c23a49ffb1d640d47477f6150d5c1edab182196babd4f6bcffd17eae"
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.235410 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9gln9"]
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.242875 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9gln9"]
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.255550 4753 scope.go:117] "RemoveContainer" containerID="940601bd9c452241d513d3bc261a4f47a95f736501b455acf5fb302cddac0888"
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.329175 4753 scope.go:117] "RemoveContainer" containerID="5b62f2d6804009c30228f28acb24d093021c2f64bf894087e71f377ce358ca9e"
Oct 05 21:32:25 crc kubenswrapper[4753]: E1005 21:32:25.329828 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b62f2d6804009c30228f28acb24d093021c2f64bf894087e71f377ce358ca9e\": container with ID starting with 5b62f2d6804009c30228f28acb24d093021c2f64bf894087e71f377ce358ca9e not found: ID does not exist" containerID="5b62f2d6804009c30228f28acb24d093021c2f64bf894087e71f377ce358ca9e"
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.329873 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b62f2d6804009c30228f28acb24d093021c2f64bf894087e71f377ce358ca9e"} err="failed to get container status \"5b62f2d6804009c30228f28acb24d093021c2f64bf894087e71f377ce358ca9e\": rpc error: code = NotFound desc = could not find container \"5b62f2d6804009c30228f28acb24d093021c2f64bf894087e71f377ce358ca9e\": container with ID starting with 5b62f2d6804009c30228f28acb24d093021c2f64bf894087e71f377ce358ca9e not found: ID does not exist"
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.329914 4753 scope.go:117] "RemoveContainer" containerID="24879060c23a49ffb1d640d47477f6150d5c1edab182196babd4f6bcffd17eae"
Oct 05 21:32:25 crc kubenswrapper[4753]: E1005 21:32:25.330742 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24879060c23a49ffb1d640d47477f6150d5c1edab182196babd4f6bcffd17eae\": container with ID starting with 24879060c23a49ffb1d640d47477f6150d5c1edab182196babd4f6bcffd17eae not found: ID does not exist" containerID="24879060c23a49ffb1d640d47477f6150d5c1edab182196babd4f6bcffd17eae"
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.330987 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24879060c23a49ffb1d640d47477f6150d5c1edab182196babd4f6bcffd17eae"} err="failed to get container status \"24879060c23a49ffb1d640d47477f6150d5c1edab182196babd4f6bcffd17eae\": rpc error: code = NotFound desc = could not find container \"24879060c23a49ffb1d640d47477f6150d5c1edab182196babd4f6bcffd17eae\": container with ID starting with 24879060c23a49ffb1d640d47477f6150d5c1edab182196babd4f6bcffd17eae not found: ID does not exist"
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.331213 4753 scope.go:117] "RemoveContainer" containerID="940601bd9c452241d513d3bc261a4f47a95f736501b455acf5fb302cddac0888"
Oct 05 21:32:25 crc kubenswrapper[4753]: E1005 21:32:25.333263 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"940601bd9c452241d513d3bc261a4f47a95f736501b455acf5fb302cddac0888\": container with ID starting with 940601bd9c452241d513d3bc261a4f47a95f736501b455acf5fb302cddac0888 not found: ID does not exist" containerID="940601bd9c452241d513d3bc261a4f47a95f736501b455acf5fb302cddac0888"
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.333288 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"940601bd9c452241d513d3bc261a4f47a95f736501b455acf5fb302cddac0888"} err="failed to get container status \"940601bd9c452241d513d3bc261a4f47a95f736501b455acf5fb302cddac0888\": rpc error: code = NotFound desc = could not find container \"940601bd9c452241d513d3bc261a4f47a95f736501b455acf5fb302cddac0888\": container with ID starting with 940601bd9c452241d513d3bc261a4f47a95f736501b455acf5fb302cddac0888 not found: ID does not exist"
Oct 05 21:32:25 crc kubenswrapper[4753]: I1005 21:32:25.863278 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" path="/var/lib/kubelet/pods/c0aeb2d2-9f71-4665-99ba-4a02e4fc5206/volumes"
Oct 05 21:32:33 crc kubenswrapper[4753]: I1005 21:32:33.853305 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691"
Oct 05 21:32:33 crc kubenswrapper[4753]: E1005 21:32:33.854490 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37"
Oct 05 21:32:40 crc kubenswrapper[4753]: I1005 21:32:40.398286 4753 generic.go:334] "Generic (PLEG): container finished" podID="a8f3008a-d05c-4bff-8cab-025b62c2c216" containerID="8a071fed57c3098f869951e7c2b9303d68b8935b3f98ff6e40d0aa3a719fedb2" exitCode=0
Oct 05 21:32:40 crc kubenswrapper[4753]: I1005 21:32:40.398863 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-txk7h/must-gather-w7hs5" event={"ID":"a8f3008a-d05c-4bff-8cab-025b62c2c216","Type":"ContainerDied","Data":"8a071fed57c3098f869951e7c2b9303d68b8935b3f98ff6e40d0aa3a719fedb2"}
Oct 05 21:32:40 crc kubenswrapper[4753]: I1005 21:32:40.399427 4753 scope.go:117] "RemoveContainer" containerID="8a071fed57c3098f869951e7c2b9303d68b8935b3f98ff6e40d0aa3a719fedb2"
Oct 05 21:32:40 crc kubenswrapper[4753]: I1005 21:32:40.554713 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-txk7h_must-gather-w7hs5_a8f3008a-d05c-4bff-8cab-025b62c2c216/gather/0.log"
Oct 05 21:32:44 crc kubenswrapper[4753]: I1005 21:32:44.852903 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691"
Oct 05 21:32:45 crc kubenswrapper[4753]: I1005 21:32:45.472308 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"5905e260f85412f537154f743daa6fab4914430b1ac727576dca9ac43782f6d3"}
Oct 05 21:32:50 crc kubenswrapper[4753]: I1005 21:32:50.537210 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-txk7h/must-gather-w7hs5"]
Oct 05 21:32:50 crc kubenswrapper[4753]: I1005 21:32:50.537921 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-txk7h/must-gather-w7hs5" podUID="a8f3008a-d05c-4bff-8cab-025b62c2c216" containerName="copy" containerID="cri-o://619841d507eb52f86dc132c695c90b11e4907330a7d59d58a386ec38adb934aa" gracePeriod=2
Oct 05 21:32:50 crc kubenswrapper[4753]: I1005 21:32:50.551622 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-txk7h/must-gather-w7hs5"]
Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.050569 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-txk7h_must-gather-w7hs5_a8f3008a-d05c-4bff-8cab-025b62c2c216/copy/0.log"
Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.051504 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-txk7h/must-gather-w7hs5"
Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.054721 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a8f3008a-d05c-4bff-8cab-025b62c2c216-must-gather-output\") pod \"a8f3008a-d05c-4bff-8cab-025b62c2c216\" (UID: \"a8f3008a-d05c-4bff-8cab-025b62c2c216\") "
Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.157860 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9qn5\" (UniqueName: \"kubernetes.io/projected/a8f3008a-d05c-4bff-8cab-025b62c2c216-kube-api-access-x9qn5\") pod \"a8f3008a-d05c-4bff-8cab-025b62c2c216\" (UID: \"a8f3008a-d05c-4bff-8cab-025b62c2c216\") "
Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.182069 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8f3008a-d05c-4bff-8cab-025b62c2c216-kube-api-access-x9qn5" (OuterVolumeSpecName: "kube-api-access-x9qn5") pod "a8f3008a-d05c-4bff-8cab-025b62c2c216" (UID: "a8f3008a-d05c-4bff-8cab-025b62c2c216"). InnerVolumeSpecName "kube-api-access-x9qn5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.252274 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8f3008a-d05c-4bff-8cab-025b62c2c216-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a8f3008a-d05c-4bff-8cab-025b62c2c216" (UID: "a8f3008a-d05c-4bff-8cab-025b62c2c216"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.261528 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9qn5\" (UniqueName: \"kubernetes.io/projected/a8f3008a-d05c-4bff-8cab-025b62c2c216-kube-api-access-x9qn5\") on node \"crc\" DevicePath \"\""
Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.261552 4753 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a8f3008a-d05c-4bff-8cab-025b62c2c216-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.536473 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-txk7h_must-gather-w7hs5_a8f3008a-d05c-4bff-8cab-025b62c2c216/copy/0.log"
Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.537062 4753 generic.go:334] "Generic (PLEG): container finished" podID="a8f3008a-d05c-4bff-8cab-025b62c2c216" containerID="619841d507eb52f86dc132c695c90b11e4907330a7d59d58a386ec38adb934aa" exitCode=143
Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.537124 4753 scope.go:117] "RemoveContainer" containerID="619841d507eb52f86dc132c695c90b11e4907330a7d59d58a386ec38adb934aa"
Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.537276 4753 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-txk7h/must-gather-w7hs5" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.578421 4753 scope.go:117] "RemoveContainer" containerID="8a071fed57c3098f869951e7c2b9303d68b8935b3f98ff6e40d0aa3a719fedb2" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.646896 4753 scope.go:117] "RemoveContainer" containerID="619841d507eb52f86dc132c695c90b11e4907330a7d59d58a386ec38adb934aa" Oct 05 21:32:51 crc kubenswrapper[4753]: E1005 21:32:51.647672 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"619841d507eb52f86dc132c695c90b11e4907330a7d59d58a386ec38adb934aa\": container with ID starting with 619841d507eb52f86dc132c695c90b11e4907330a7d59d58a386ec38adb934aa not found: ID does not exist" containerID="619841d507eb52f86dc132c695c90b11e4907330a7d59d58a386ec38adb934aa" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.647702 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"619841d507eb52f86dc132c695c90b11e4907330a7d59d58a386ec38adb934aa"} err="failed to get container status \"619841d507eb52f86dc132c695c90b11e4907330a7d59d58a386ec38adb934aa\": rpc error: code = NotFound desc = could not find container \"619841d507eb52f86dc132c695c90b11e4907330a7d59d58a386ec38adb934aa\": container with ID starting with 619841d507eb52f86dc132c695c90b11e4907330a7d59d58a386ec38adb934aa not found: ID does not exist" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.647724 4753 scope.go:117] "RemoveContainer" containerID="8a071fed57c3098f869951e7c2b9303d68b8935b3f98ff6e40d0aa3a719fedb2" Oct 05 21:32:51 crc kubenswrapper[4753]: E1005 21:32:51.648323 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a071fed57c3098f869951e7c2b9303d68b8935b3f98ff6e40d0aa3a719fedb2\": container with ID starting with 
8a071fed57c3098f869951e7c2b9303d68b8935b3f98ff6e40d0aa3a719fedb2 not found: ID does not exist" containerID="8a071fed57c3098f869951e7c2b9303d68b8935b3f98ff6e40d0aa3a719fedb2" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.648349 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a071fed57c3098f869951e7c2b9303d68b8935b3f98ff6e40d0aa3a719fedb2"} err="failed to get container status \"8a071fed57c3098f869951e7c2b9303d68b8935b3f98ff6e40d0aa3a719fedb2\": rpc error: code = NotFound desc = could not find container \"8a071fed57c3098f869951e7c2b9303d68b8935b3f98ff6e40d0aa3a719fedb2\": container with ID starting with 8a071fed57c3098f869951e7c2b9303d68b8935b3f98ff6e40d0aa3a719fedb2 not found: ID does not exist" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.735668 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v26fb"] Oct 05 21:32:51 crc kubenswrapper[4753]: E1005 21:32:51.736009 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8f3008a-d05c-4bff-8cab-025b62c2c216" containerName="gather" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.736023 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f3008a-d05c-4bff-8cab-025b62c2c216" containerName="gather" Oct 05 21:32:51 crc kubenswrapper[4753]: E1005 21:32:51.736033 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8f3008a-d05c-4bff-8cab-025b62c2c216" containerName="copy" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.736039 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f3008a-d05c-4bff-8cab-025b62c2c216" containerName="copy" Oct 05 21:32:51 crc kubenswrapper[4753]: E1005 21:32:51.736059 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" containerName="extract-utilities" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.736066 4753 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" containerName="extract-utilities" Oct 05 21:32:51 crc kubenswrapper[4753]: E1005 21:32:51.736092 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" containerName="extract-content" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.736099 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" containerName="extract-content" Oct 05 21:32:51 crc kubenswrapper[4753]: E1005 21:32:51.736112 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" containerName="registry-server" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.736119 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" containerName="registry-server" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.736482 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0aeb2d2-9f71-4665-99ba-4a02e4fc5206" containerName="registry-server" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.736505 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8f3008a-d05c-4bff-8cab-025b62c2c216" containerName="gather" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.736518 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8f3008a-d05c-4bff-8cab-025b62c2c216" containerName="copy" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.737925 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.765879 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v26fb"] Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.769233 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-utilities\") pod \"redhat-marketplace-v26fb\" (UID: \"d4efd21c-2c73-4ff9-b0de-41f2121b10fc\") " pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.769316 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2m5b\" (UniqueName: \"kubernetes.io/projected/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-kube-api-access-s2m5b\") pod \"redhat-marketplace-v26fb\" (UID: \"d4efd21c-2c73-4ff9-b0de-41f2121b10fc\") " pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.769421 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-catalog-content\") pod \"redhat-marketplace-v26fb\" (UID: \"d4efd21c-2c73-4ff9-b0de-41f2121b10fc\") " pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.870774 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8f3008a-d05c-4bff-8cab-025b62c2c216" path="/var/lib/kubelet/pods/a8f3008a-d05c-4bff-8cab-025b62c2c216/volumes" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.873474 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-catalog-content\") pod \"redhat-marketplace-v26fb\" (UID: \"d4efd21c-2c73-4ff9-b0de-41f2121b10fc\") " pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.873528 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-utilities\") pod \"redhat-marketplace-v26fb\" (UID: \"d4efd21c-2c73-4ff9-b0de-41f2121b10fc\") " pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.873581 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2m5b\" (UniqueName: \"kubernetes.io/projected/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-kube-api-access-s2m5b\") pod \"redhat-marketplace-v26fb\" (UID: \"d4efd21c-2c73-4ff9-b0de-41f2121b10fc\") " pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.874477 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-utilities\") pod \"redhat-marketplace-v26fb\" (UID: \"d4efd21c-2c73-4ff9-b0de-41f2121b10fc\") " pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.874682 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-catalog-content\") pod \"redhat-marketplace-v26fb\" (UID: \"d4efd21c-2c73-4ff9-b0de-41f2121b10fc\") " pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:32:51 crc kubenswrapper[4753]: I1005 21:32:51.904094 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2m5b\" (UniqueName: 
\"kubernetes.io/projected/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-kube-api-access-s2m5b\") pod \"redhat-marketplace-v26fb\" (UID: \"d4efd21c-2c73-4ff9-b0de-41f2121b10fc\") " pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:32:52 crc kubenswrapper[4753]: I1005 21:32:52.065029 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:32:52 crc kubenswrapper[4753]: I1005 21:32:52.518741 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v26fb"] Oct 05 21:32:52 crc kubenswrapper[4753]: I1005 21:32:52.561287 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v26fb" event={"ID":"d4efd21c-2c73-4ff9-b0de-41f2121b10fc","Type":"ContainerStarted","Data":"8a85873064c4748ba008e2b20855d9274b2476e8d9a6e90096d930180909b5d4"} Oct 05 21:32:53 crc kubenswrapper[4753]: I1005 21:32:53.570920 4753 generic.go:334] "Generic (PLEG): container finished" podID="d4efd21c-2c73-4ff9-b0de-41f2121b10fc" containerID="e3e3d2fbf601436230be5a37ddb4f041ffba8fa1b772eeab78bcf6e154604c69" exitCode=0 Oct 05 21:32:53 crc kubenswrapper[4753]: I1005 21:32:53.571156 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v26fb" event={"ID":"d4efd21c-2c73-4ff9-b0de-41f2121b10fc","Type":"ContainerDied","Data":"e3e3d2fbf601436230be5a37ddb4f041ffba8fa1b772eeab78bcf6e154604c69"} Oct 05 21:32:55 crc kubenswrapper[4753]: I1005 21:32:55.600439 4753 generic.go:334] "Generic (PLEG): container finished" podID="d4efd21c-2c73-4ff9-b0de-41f2121b10fc" containerID="7fcb6715123de32223e77ad91e9748317edc30f9a0290861151415bf209f5037" exitCode=0 Oct 05 21:32:55 crc kubenswrapper[4753]: I1005 21:32:55.600492 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v26fb" 
event={"ID":"d4efd21c-2c73-4ff9-b0de-41f2121b10fc","Type":"ContainerDied","Data":"7fcb6715123de32223e77ad91e9748317edc30f9a0290861151415bf209f5037"} Oct 05 21:32:56 crc kubenswrapper[4753]: I1005 21:32:56.612764 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v26fb" event={"ID":"d4efd21c-2c73-4ff9-b0de-41f2121b10fc","Type":"ContainerStarted","Data":"c502b26c3eed11c00c3aed1a9b8ebc5fe3154d3d32a45337ab46052f1cfc391c"} Oct 05 21:32:56 crc kubenswrapper[4753]: I1005 21:32:56.629018 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v26fb" podStartSLOduration=3.176487769 podStartE2EDuration="5.628982937s" podCreationTimestamp="2025-10-05 21:32:51 +0000 UTC" firstStartedPulling="2025-10-05 21:32:53.57350471 +0000 UTC m=+4682.421832952" lastFinishedPulling="2025-10-05 21:32:56.025999848 +0000 UTC m=+4684.874328120" observedRunningTime="2025-10-05 21:32:56.628490162 +0000 UTC m=+4685.476818394" watchObservedRunningTime="2025-10-05 21:32:56.628982937 +0000 UTC m=+4685.477311179" Oct 05 21:33:02 crc kubenswrapper[4753]: I1005 21:33:02.066003 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:33:02 crc kubenswrapper[4753]: I1005 21:33:02.067481 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:33:02 crc kubenswrapper[4753]: I1005 21:33:02.130862 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:33:02 crc kubenswrapper[4753]: I1005 21:33:02.757752 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:33:02 crc kubenswrapper[4753]: I1005 21:33:02.820325 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-v26fb"] Oct 05 21:33:04 crc kubenswrapper[4753]: I1005 21:33:04.701863 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v26fb" podUID="d4efd21c-2c73-4ff9-b0de-41f2121b10fc" containerName="registry-server" containerID="cri-o://c502b26c3eed11c00c3aed1a9b8ebc5fe3154d3d32a45337ab46052f1cfc391c" gracePeriod=2 Oct 05 21:33:05 crc kubenswrapper[4753]: I1005 21:33:05.717298 4753 generic.go:334] "Generic (PLEG): container finished" podID="d4efd21c-2c73-4ff9-b0de-41f2121b10fc" containerID="c502b26c3eed11c00c3aed1a9b8ebc5fe3154d3d32a45337ab46052f1cfc391c" exitCode=0 Oct 05 21:33:05 crc kubenswrapper[4753]: I1005 21:33:05.717343 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v26fb" event={"ID":"d4efd21c-2c73-4ff9-b0de-41f2121b10fc","Type":"ContainerDied","Data":"c502b26c3eed11c00c3aed1a9b8ebc5fe3154d3d32a45337ab46052f1cfc391c"} Oct 05 21:33:05 crc kubenswrapper[4753]: I1005 21:33:05.717711 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v26fb" event={"ID":"d4efd21c-2c73-4ff9-b0de-41f2121b10fc","Type":"ContainerDied","Data":"8a85873064c4748ba008e2b20855d9274b2476e8d9a6e90096d930180909b5d4"} Oct 05 21:33:05 crc kubenswrapper[4753]: I1005 21:33:05.717734 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a85873064c4748ba008e2b20855d9274b2476e8d9a6e90096d930180909b5d4" Oct 05 21:33:05 crc kubenswrapper[4753]: I1005 21:33:05.790780 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:33:05 crc kubenswrapper[4753]: I1005 21:33:05.888002 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-utilities\") pod \"d4efd21c-2c73-4ff9-b0de-41f2121b10fc\" (UID: \"d4efd21c-2c73-4ff9-b0de-41f2121b10fc\") " Oct 05 21:33:05 crc kubenswrapper[4753]: I1005 21:33:05.888127 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-catalog-content\") pod \"d4efd21c-2c73-4ff9-b0de-41f2121b10fc\" (UID: \"d4efd21c-2c73-4ff9-b0de-41f2121b10fc\") " Oct 05 21:33:05 crc kubenswrapper[4753]: I1005 21:33:05.888504 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2m5b\" (UniqueName: \"kubernetes.io/projected/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-kube-api-access-s2m5b\") pod \"d4efd21c-2c73-4ff9-b0de-41f2121b10fc\" (UID: \"d4efd21c-2c73-4ff9-b0de-41f2121b10fc\") " Oct 05 21:33:05 crc kubenswrapper[4753]: I1005 21:33:05.889436 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-utilities" (OuterVolumeSpecName: "utilities") pod "d4efd21c-2c73-4ff9-b0de-41f2121b10fc" (UID: "d4efd21c-2c73-4ff9-b0de-41f2121b10fc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:33:05 crc kubenswrapper[4753]: I1005 21:33:05.890078 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 21:33:05 crc kubenswrapper[4753]: I1005 21:33:05.915097 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4efd21c-2c73-4ff9-b0de-41f2121b10fc" (UID: "d4efd21c-2c73-4ff9-b0de-41f2121b10fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:33:05 crc kubenswrapper[4753]: I1005 21:33:05.923530 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-kube-api-access-s2m5b" (OuterVolumeSpecName: "kube-api-access-s2m5b") pod "d4efd21c-2c73-4ff9-b0de-41f2121b10fc" (UID: "d4efd21c-2c73-4ff9-b0de-41f2121b10fc"). InnerVolumeSpecName "kube-api-access-s2m5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:33:05 crc kubenswrapper[4753]: I1005 21:33:05.991469 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2m5b\" (UniqueName: \"kubernetes.io/projected/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-kube-api-access-s2m5b\") on node \"crc\" DevicePath \"\"" Oct 05 21:33:05 crc kubenswrapper[4753]: I1005 21:33:05.991891 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4efd21c-2c73-4ff9-b0de-41f2121b10fc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 21:33:06 crc kubenswrapper[4753]: I1005 21:33:06.455812 4753 scope.go:117] "RemoveContainer" containerID="cce6d019711360b0e1303c6875bbb315880b2d1b7fb0b6d6fcf150422bb10631" Oct 05 21:33:06 crc kubenswrapper[4753]: I1005 21:33:06.727520 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v26fb" Oct 05 21:33:06 crc kubenswrapper[4753]: I1005 21:33:06.777565 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v26fb"] Oct 05 21:33:06 crc kubenswrapper[4753]: I1005 21:33:06.788792 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v26fb"] Oct 05 21:33:07 crc kubenswrapper[4753]: I1005 21:33:07.874926 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4efd21c-2c73-4ff9-b0de-41f2121b10fc" path="/var/lib/kubelet/pods/d4efd21c-2c73-4ff9-b0de-41f2121b10fc/volumes" Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 21:33:26.294426 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wr5ml/must-gather-rwvpx"] Oct 05 21:33:26 crc kubenswrapper[4753]: E1005 21:33:26.296243 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4efd21c-2c73-4ff9-b0de-41f2121b10fc" containerName="registry-server" Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 
21:33:26.296357 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4efd21c-2c73-4ff9-b0de-41f2121b10fc" containerName="registry-server" Oct 05 21:33:26 crc kubenswrapper[4753]: E1005 21:33:26.296422 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4efd21c-2c73-4ff9-b0de-41f2121b10fc" containerName="extract-utilities" Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 21:33:26.296477 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4efd21c-2c73-4ff9-b0de-41f2121b10fc" containerName="extract-utilities" Oct 05 21:33:26 crc kubenswrapper[4753]: E1005 21:33:26.296602 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4efd21c-2c73-4ff9-b0de-41f2121b10fc" containerName="extract-content" Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 21:33:26.296611 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4efd21c-2c73-4ff9-b0de-41f2121b10fc" containerName="extract-content" Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 21:33:26.296959 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4efd21c-2c73-4ff9-b0de-41f2121b10fc" containerName="registry-server" Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 21:33:26.298054 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wr5ml/must-gather-rwvpx" Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 21:33:26.306257 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wr5ml"/"openshift-service-ca.crt" Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 21:33:26.307007 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wr5ml"/"kube-root-ca.crt" Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 21:33:26.312108 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wr5ml/must-gather-rwvpx"] Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 21:33:26.332947 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d98a546-ae8b-432d-a788-79fcac33bcd3-must-gather-output\") pod \"must-gather-rwvpx\" (UID: \"1d98a546-ae8b-432d-a788-79fcac33bcd3\") " pod="openshift-must-gather-wr5ml/must-gather-rwvpx" Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 21:33:26.332987 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25flr\" (UniqueName: \"kubernetes.io/projected/1d98a546-ae8b-432d-a788-79fcac33bcd3-kube-api-access-25flr\") pod \"must-gather-rwvpx\" (UID: \"1d98a546-ae8b-432d-a788-79fcac33bcd3\") " pod="openshift-must-gather-wr5ml/must-gather-rwvpx" Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 21:33:26.434009 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d98a546-ae8b-432d-a788-79fcac33bcd3-must-gather-output\") pod \"must-gather-rwvpx\" (UID: \"1d98a546-ae8b-432d-a788-79fcac33bcd3\") " pod="openshift-must-gather-wr5ml/must-gather-rwvpx" Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 21:33:26.434058 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-25flr\" (UniqueName: \"kubernetes.io/projected/1d98a546-ae8b-432d-a788-79fcac33bcd3-kube-api-access-25flr\") pod \"must-gather-rwvpx\" (UID: \"1d98a546-ae8b-432d-a788-79fcac33bcd3\") " pod="openshift-must-gather-wr5ml/must-gather-rwvpx" Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 21:33:26.434749 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d98a546-ae8b-432d-a788-79fcac33bcd3-must-gather-output\") pod \"must-gather-rwvpx\" (UID: \"1d98a546-ae8b-432d-a788-79fcac33bcd3\") " pod="openshift-must-gather-wr5ml/must-gather-rwvpx" Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 21:33:26.481330 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25flr\" (UniqueName: \"kubernetes.io/projected/1d98a546-ae8b-432d-a788-79fcac33bcd3-kube-api-access-25flr\") pod \"must-gather-rwvpx\" (UID: \"1d98a546-ae8b-432d-a788-79fcac33bcd3\") " pod="openshift-must-gather-wr5ml/must-gather-rwvpx" Oct 05 21:33:26 crc kubenswrapper[4753]: I1005 21:33:26.624490 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wr5ml/must-gather-rwvpx" Oct 05 21:33:27 crc kubenswrapper[4753]: I1005 21:33:27.203547 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wr5ml/must-gather-rwvpx"] Oct 05 21:33:27 crc kubenswrapper[4753]: W1005 21:33:27.216704 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d98a546_ae8b_432d_a788_79fcac33bcd3.slice/crio-2513ba50ed1ba4363acc8b0d3865faf12582bc1cd811f92a9eb0a18c00acd362 WatchSource:0}: Error finding container 2513ba50ed1ba4363acc8b0d3865faf12582bc1cd811f92a9eb0a18c00acd362: Status 404 returned error can't find the container with id 2513ba50ed1ba4363acc8b0d3865faf12582bc1cd811f92a9eb0a18c00acd362 Oct 05 21:33:27 crc kubenswrapper[4753]: I1005 21:33:27.954092 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr5ml/must-gather-rwvpx" event={"ID":"1d98a546-ae8b-432d-a788-79fcac33bcd3","Type":"ContainerStarted","Data":"7a9f95472aba224d1d97ff0a74fdda4913be2ac1a56bbb71f9f2a264af3a9cbe"} Oct 05 21:33:27 crc kubenswrapper[4753]: I1005 21:33:27.954497 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr5ml/must-gather-rwvpx" event={"ID":"1d98a546-ae8b-432d-a788-79fcac33bcd3","Type":"ContainerStarted","Data":"5f42796b4154efedeaad017dc94f7c0f88a7e70e55236a22659942ad0bd8f3a8"} Oct 05 21:33:27 crc kubenswrapper[4753]: I1005 21:33:27.954507 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr5ml/must-gather-rwvpx" event={"ID":"1d98a546-ae8b-432d-a788-79fcac33bcd3","Type":"ContainerStarted","Data":"2513ba50ed1ba4363acc8b0d3865faf12582bc1cd811f92a9eb0a18c00acd362"} Oct 05 21:33:27 crc kubenswrapper[4753]: I1005 21:33:27.971591 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wr5ml/must-gather-rwvpx" podStartSLOduration=1.9715735300000001 
podStartE2EDuration="1.97157353s" podCreationTimestamp="2025-10-05 21:33:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 21:33:27.965842284 +0000 UTC m=+4716.814170516" watchObservedRunningTime="2025-10-05 21:33:27.97157353 +0000 UTC m=+4716.819901762" Oct 05 21:33:30 crc kubenswrapper[4753]: E1005 21:33:30.722625 4753 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.75:36774->38.102.83.75:35117: read tcp 38.102.83.75:36774->38.102.83.75:35117: read: connection reset by peer Oct 05 21:33:32 crc kubenswrapper[4753]: I1005 21:33:32.214476 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wr5ml/crc-debug-k8n7s"] Oct 05 21:33:32 crc kubenswrapper[4753]: I1005 21:33:32.215978 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wr5ml/crc-debug-k8n7s" Oct 05 21:33:32 crc kubenswrapper[4753]: I1005 21:33:32.218537 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wr5ml"/"default-dockercfg-x66q6" Oct 05 21:33:32 crc kubenswrapper[4753]: I1005 21:33:32.292065 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlmtw\" (UniqueName: \"kubernetes.io/projected/d843dab7-0131-4469-9d9b-d8fcde0368d3-kube-api-access-zlmtw\") pod \"crc-debug-k8n7s\" (UID: \"d843dab7-0131-4469-9d9b-d8fcde0368d3\") " pod="openshift-must-gather-wr5ml/crc-debug-k8n7s" Oct 05 21:33:32 crc kubenswrapper[4753]: I1005 21:33:32.292240 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d843dab7-0131-4469-9d9b-d8fcde0368d3-host\") pod \"crc-debug-k8n7s\" (UID: \"d843dab7-0131-4469-9d9b-d8fcde0368d3\") " pod="openshift-must-gather-wr5ml/crc-debug-k8n7s" Oct 05 21:33:32 crc kubenswrapper[4753]: 
I1005 21:33:32.394464 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlmtw\" (UniqueName: \"kubernetes.io/projected/d843dab7-0131-4469-9d9b-d8fcde0368d3-kube-api-access-zlmtw\") pod \"crc-debug-k8n7s\" (UID: \"d843dab7-0131-4469-9d9b-d8fcde0368d3\") " pod="openshift-must-gather-wr5ml/crc-debug-k8n7s" Oct 05 21:33:32 crc kubenswrapper[4753]: I1005 21:33:32.394530 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d843dab7-0131-4469-9d9b-d8fcde0368d3-host\") pod \"crc-debug-k8n7s\" (UID: \"d843dab7-0131-4469-9d9b-d8fcde0368d3\") " pod="openshift-must-gather-wr5ml/crc-debug-k8n7s" Oct 05 21:33:32 crc kubenswrapper[4753]: I1005 21:33:32.394698 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d843dab7-0131-4469-9d9b-d8fcde0368d3-host\") pod \"crc-debug-k8n7s\" (UID: \"d843dab7-0131-4469-9d9b-d8fcde0368d3\") " pod="openshift-must-gather-wr5ml/crc-debug-k8n7s" Oct 05 21:33:32 crc kubenswrapper[4753]: I1005 21:33:32.416056 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlmtw\" (UniqueName: \"kubernetes.io/projected/d843dab7-0131-4469-9d9b-d8fcde0368d3-kube-api-access-zlmtw\") pod \"crc-debug-k8n7s\" (UID: \"d843dab7-0131-4469-9d9b-d8fcde0368d3\") " pod="openshift-must-gather-wr5ml/crc-debug-k8n7s" Oct 05 21:33:32 crc kubenswrapper[4753]: I1005 21:33:32.534900 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wr5ml/crc-debug-k8n7s" Oct 05 21:33:33 crc kubenswrapper[4753]: I1005 21:33:33.000372 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr5ml/crc-debug-k8n7s" event={"ID":"d843dab7-0131-4469-9d9b-d8fcde0368d3","Type":"ContainerStarted","Data":"bfbd6f27c782f252c6ca2d0ed708bcc0e56d3656d3be02c6d06c915536356c51"} Oct 05 21:33:33 crc kubenswrapper[4753]: I1005 21:33:33.000780 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr5ml/crc-debug-k8n7s" event={"ID":"d843dab7-0131-4469-9d9b-d8fcde0368d3","Type":"ContainerStarted","Data":"c00f879d4132919e1377a9328f59558e1579722ff523e300fdb140d98ef39b5f"} Oct 05 21:33:33 crc kubenswrapper[4753]: I1005 21:33:33.020537 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wr5ml/crc-debug-k8n7s" podStartSLOduration=1.020518974 podStartE2EDuration="1.020518974s" podCreationTimestamp="2025-10-05 21:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-05 21:33:33.014981452 +0000 UTC m=+4721.863309684" watchObservedRunningTime="2025-10-05 21:33:33.020518974 +0000 UTC m=+4721.868847216" Oct 05 21:34:13 crc kubenswrapper[4753]: I1005 21:34:13.770588 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hn95q"] Oct 05 21:34:13 crc kubenswrapper[4753]: I1005 21:34:13.772844 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:13 crc kubenswrapper[4753]: I1005 21:34:13.807870 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hn95q"] Oct 05 21:34:13 crc kubenswrapper[4753]: I1005 21:34:13.947698 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e4b4643-7e36-4218-8fd1-c819eb806de1-catalog-content\") pod \"certified-operators-hn95q\" (UID: \"3e4b4643-7e36-4218-8fd1-c819eb806de1\") " pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:13 crc kubenswrapper[4753]: I1005 21:34:13.947755 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e4b4643-7e36-4218-8fd1-c819eb806de1-utilities\") pod \"certified-operators-hn95q\" (UID: \"3e4b4643-7e36-4218-8fd1-c819eb806de1\") " pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:13 crc kubenswrapper[4753]: I1005 21:34:13.947871 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgltp\" (UniqueName: \"kubernetes.io/projected/3e4b4643-7e36-4218-8fd1-c819eb806de1-kube-api-access-pgltp\") pod \"certified-operators-hn95q\" (UID: \"3e4b4643-7e36-4218-8fd1-c819eb806de1\") " pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:14 crc kubenswrapper[4753]: I1005 21:34:14.049334 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgltp\" (UniqueName: \"kubernetes.io/projected/3e4b4643-7e36-4218-8fd1-c819eb806de1-kube-api-access-pgltp\") pod \"certified-operators-hn95q\" (UID: \"3e4b4643-7e36-4218-8fd1-c819eb806de1\") " pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:14 crc kubenswrapper[4753]: I1005 21:34:14.049428 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e4b4643-7e36-4218-8fd1-c819eb806de1-catalog-content\") pod \"certified-operators-hn95q\" (UID: \"3e4b4643-7e36-4218-8fd1-c819eb806de1\") " pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:14 crc kubenswrapper[4753]: I1005 21:34:14.049457 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e4b4643-7e36-4218-8fd1-c819eb806de1-utilities\") pod \"certified-operators-hn95q\" (UID: \"3e4b4643-7e36-4218-8fd1-c819eb806de1\") " pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:14 crc kubenswrapper[4753]: I1005 21:34:14.050013 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e4b4643-7e36-4218-8fd1-c819eb806de1-catalog-content\") pod \"certified-operators-hn95q\" (UID: \"3e4b4643-7e36-4218-8fd1-c819eb806de1\") " pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:14 crc kubenswrapper[4753]: I1005 21:34:14.050061 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e4b4643-7e36-4218-8fd1-c819eb806de1-utilities\") pod \"certified-operators-hn95q\" (UID: \"3e4b4643-7e36-4218-8fd1-c819eb806de1\") " pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:14 crc kubenswrapper[4753]: I1005 21:34:14.543121 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgltp\" (UniqueName: \"kubernetes.io/projected/3e4b4643-7e36-4218-8fd1-c819eb806de1-kube-api-access-pgltp\") pod \"certified-operators-hn95q\" (UID: \"3e4b4643-7e36-4218-8fd1-c819eb806de1\") " pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:14 crc kubenswrapper[4753]: I1005 21:34:14.703642 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:15 crc kubenswrapper[4753]: I1005 21:34:15.490892 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hn95q"] Oct 05 21:34:16 crc kubenswrapper[4753]: I1005 21:34:16.358708 4753 generic.go:334] "Generic (PLEG): container finished" podID="3e4b4643-7e36-4218-8fd1-c819eb806de1" containerID="971dd4b50613508882da83c592f09a1ea0cf6c3e7e4da6859d2919067c1c08df" exitCode=0 Oct 05 21:34:16 crc kubenswrapper[4753]: I1005 21:34:16.358980 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hn95q" event={"ID":"3e4b4643-7e36-4218-8fd1-c819eb806de1","Type":"ContainerDied","Data":"971dd4b50613508882da83c592f09a1ea0cf6c3e7e4da6859d2919067c1c08df"} Oct 05 21:34:16 crc kubenswrapper[4753]: I1005 21:34:16.359003 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hn95q" event={"ID":"3e4b4643-7e36-4218-8fd1-c819eb806de1","Type":"ContainerStarted","Data":"ddc00811dcf74a4abd2244743d6109b0dc1339dcff12b8116616f546a98bda15"} Oct 05 21:34:17 crc kubenswrapper[4753]: I1005 21:34:17.372533 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hn95q" event={"ID":"3e4b4643-7e36-4218-8fd1-c819eb806de1","Type":"ContainerStarted","Data":"78f5272b576216a314a268042395012aed157c3674fe9ae179faa703c84dca16"} Oct 05 21:34:19 crc kubenswrapper[4753]: I1005 21:34:19.394349 4753 generic.go:334] "Generic (PLEG): container finished" podID="3e4b4643-7e36-4218-8fd1-c819eb806de1" containerID="78f5272b576216a314a268042395012aed157c3674fe9ae179faa703c84dca16" exitCode=0 Oct 05 21:34:19 crc kubenswrapper[4753]: I1005 21:34:19.394432 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hn95q" 
event={"ID":"3e4b4643-7e36-4218-8fd1-c819eb806de1","Type":"ContainerDied","Data":"78f5272b576216a314a268042395012aed157c3674fe9ae179faa703c84dca16"} Oct 05 21:34:20 crc kubenswrapper[4753]: I1005 21:34:20.412310 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hn95q" event={"ID":"3e4b4643-7e36-4218-8fd1-c819eb806de1","Type":"ContainerStarted","Data":"077d7f67816fbc874ce4f80bc39bef21ce2cf38489828e186397de5bd8f0b866"} Oct 05 21:34:20 crc kubenswrapper[4753]: I1005 21:34:20.430488 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hn95q" podStartSLOduration=3.950381266 podStartE2EDuration="7.430471994s" podCreationTimestamp="2025-10-05 21:34:13 +0000 UTC" firstStartedPulling="2025-10-05 21:34:16.360485901 +0000 UTC m=+4765.208814133" lastFinishedPulling="2025-10-05 21:34:19.840576629 +0000 UTC m=+4768.688904861" observedRunningTime="2025-10-05 21:34:20.426403098 +0000 UTC m=+4769.274731330" watchObservedRunningTime="2025-10-05 21:34:20.430471994 +0000 UTC m=+4769.278800226" Oct 05 21:34:24 crc kubenswrapper[4753]: I1005 21:34:24.705212 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:24 crc kubenswrapper[4753]: I1005 21:34:24.705816 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:25 crc kubenswrapper[4753]: I1005 21:34:25.757573 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-hn95q" podUID="3e4b4643-7e36-4218-8fd1-c819eb806de1" containerName="registry-server" probeResult="failure" output=< Oct 05 21:34:25 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Oct 05 21:34:25 crc kubenswrapper[4753]: > Oct 05 21:34:34 crc kubenswrapper[4753]: I1005 21:34:34.746653 4753 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:34 crc kubenswrapper[4753]: I1005 21:34:34.816515 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:34 crc kubenswrapper[4753]: I1005 21:34:34.979237 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hn95q"] Oct 05 21:34:36 crc kubenswrapper[4753]: I1005 21:34:36.548316 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hn95q" podUID="3e4b4643-7e36-4218-8fd1-c819eb806de1" containerName="registry-server" containerID="cri-o://077d7f67816fbc874ce4f80bc39bef21ce2cf38489828e186397de5bd8f0b866" gracePeriod=2 Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.063744 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.126280 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e4b4643-7e36-4218-8fd1-c819eb806de1-utilities\") pod \"3e4b4643-7e36-4218-8fd1-c819eb806de1\" (UID: \"3e4b4643-7e36-4218-8fd1-c819eb806de1\") " Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.126403 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e4b4643-7e36-4218-8fd1-c819eb806de1-catalog-content\") pod \"3e4b4643-7e36-4218-8fd1-c819eb806de1\" (UID: \"3e4b4643-7e36-4218-8fd1-c819eb806de1\") " Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.126461 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgltp\" (UniqueName: 
\"kubernetes.io/projected/3e4b4643-7e36-4218-8fd1-c819eb806de1-kube-api-access-pgltp\") pod \"3e4b4643-7e36-4218-8fd1-c819eb806de1\" (UID: \"3e4b4643-7e36-4218-8fd1-c819eb806de1\") " Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.127026 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e4b4643-7e36-4218-8fd1-c819eb806de1-utilities" (OuterVolumeSpecName: "utilities") pod "3e4b4643-7e36-4218-8fd1-c819eb806de1" (UID: "3e4b4643-7e36-4218-8fd1-c819eb806de1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.128055 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e4b4643-7e36-4218-8fd1-c819eb806de1-utilities\") on node \"crc\" DevicePath \"\"" Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.132377 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e4b4643-7e36-4218-8fd1-c819eb806de1-kube-api-access-pgltp" (OuterVolumeSpecName: "kube-api-access-pgltp") pod "3e4b4643-7e36-4218-8fd1-c819eb806de1" (UID: "3e4b4643-7e36-4218-8fd1-c819eb806de1"). InnerVolumeSpecName "kube-api-access-pgltp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.185368 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e4b4643-7e36-4218-8fd1-c819eb806de1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e4b4643-7e36-4218-8fd1-c819eb806de1" (UID: "3e4b4643-7e36-4218-8fd1-c819eb806de1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.230648 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e4b4643-7e36-4218-8fd1-c819eb806de1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.230680 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgltp\" (UniqueName: \"kubernetes.io/projected/3e4b4643-7e36-4218-8fd1-c819eb806de1-kube-api-access-pgltp\") on node \"crc\" DevicePath \"\"" Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.561187 4753 generic.go:334] "Generic (PLEG): container finished" podID="3e4b4643-7e36-4218-8fd1-c819eb806de1" containerID="077d7f67816fbc874ce4f80bc39bef21ce2cf38489828e186397de5bd8f0b866" exitCode=0 Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.561223 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hn95q" event={"ID":"3e4b4643-7e36-4218-8fd1-c819eb806de1","Type":"ContainerDied","Data":"077d7f67816fbc874ce4f80bc39bef21ce2cf38489828e186397de5bd8f0b866"} Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.561248 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hn95q" event={"ID":"3e4b4643-7e36-4218-8fd1-c819eb806de1","Type":"ContainerDied","Data":"ddc00811dcf74a4abd2244743d6109b0dc1339dcff12b8116616f546a98bda15"} Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.561265 4753 scope.go:117] "RemoveContainer" containerID="077d7f67816fbc874ce4f80bc39bef21ce2cf38489828e186397de5bd8f0b866" Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.561283 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hn95q" Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.600635 4753 scope.go:117] "RemoveContainer" containerID="78f5272b576216a314a268042395012aed157c3674fe9ae179faa703c84dca16" Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.611945 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hn95q"] Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.619465 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hn95q"] Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.632160 4753 scope.go:117] "RemoveContainer" containerID="971dd4b50613508882da83c592f09a1ea0cf6c3e7e4da6859d2919067c1c08df" Oct 05 21:34:37 crc kubenswrapper[4753]: I1005 21:34:37.862898 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e4b4643-7e36-4218-8fd1-c819eb806de1" path="/var/lib/kubelet/pods/3e4b4643-7e36-4218-8fd1-c819eb806de1/volumes" Oct 05 21:34:38 crc kubenswrapper[4753]: I1005 21:34:38.154125 4753 scope.go:117] "RemoveContainer" containerID="077d7f67816fbc874ce4f80bc39bef21ce2cf38489828e186397de5bd8f0b866" Oct 05 21:34:38 crc kubenswrapper[4753]: E1005 21:34:38.154657 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"077d7f67816fbc874ce4f80bc39bef21ce2cf38489828e186397de5bd8f0b866\": container with ID starting with 077d7f67816fbc874ce4f80bc39bef21ce2cf38489828e186397de5bd8f0b866 not found: ID does not exist" containerID="077d7f67816fbc874ce4f80bc39bef21ce2cf38489828e186397de5bd8f0b866" Oct 05 21:34:38 crc kubenswrapper[4753]: I1005 21:34:38.154704 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"077d7f67816fbc874ce4f80bc39bef21ce2cf38489828e186397de5bd8f0b866"} err="failed to get container status 
\"077d7f67816fbc874ce4f80bc39bef21ce2cf38489828e186397de5bd8f0b866\": rpc error: code = NotFound desc = could not find container \"077d7f67816fbc874ce4f80bc39bef21ce2cf38489828e186397de5bd8f0b866\": container with ID starting with 077d7f67816fbc874ce4f80bc39bef21ce2cf38489828e186397de5bd8f0b866 not found: ID does not exist" Oct 05 21:34:38 crc kubenswrapper[4753]: I1005 21:34:38.154736 4753 scope.go:117] "RemoveContainer" containerID="78f5272b576216a314a268042395012aed157c3674fe9ae179faa703c84dca16" Oct 05 21:34:38 crc kubenswrapper[4753]: E1005 21:34:38.155460 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78f5272b576216a314a268042395012aed157c3674fe9ae179faa703c84dca16\": container with ID starting with 78f5272b576216a314a268042395012aed157c3674fe9ae179faa703c84dca16 not found: ID does not exist" containerID="78f5272b576216a314a268042395012aed157c3674fe9ae179faa703c84dca16" Oct 05 21:34:38 crc kubenswrapper[4753]: I1005 21:34:38.155505 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78f5272b576216a314a268042395012aed157c3674fe9ae179faa703c84dca16"} err="failed to get container status \"78f5272b576216a314a268042395012aed157c3674fe9ae179faa703c84dca16\": rpc error: code = NotFound desc = could not find container \"78f5272b576216a314a268042395012aed157c3674fe9ae179faa703c84dca16\": container with ID starting with 78f5272b576216a314a268042395012aed157c3674fe9ae179faa703c84dca16 not found: ID does not exist" Oct 05 21:34:38 crc kubenswrapper[4753]: I1005 21:34:38.155531 4753 scope.go:117] "RemoveContainer" containerID="971dd4b50613508882da83c592f09a1ea0cf6c3e7e4da6859d2919067c1c08df" Oct 05 21:34:38 crc kubenswrapper[4753]: E1005 21:34:38.155810 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"971dd4b50613508882da83c592f09a1ea0cf6c3e7e4da6859d2919067c1c08df\": container with ID starting with 971dd4b50613508882da83c592f09a1ea0cf6c3e7e4da6859d2919067c1c08df not found: ID does not exist" containerID="971dd4b50613508882da83c592f09a1ea0cf6c3e7e4da6859d2919067c1c08df" Oct 05 21:34:38 crc kubenswrapper[4753]: I1005 21:34:38.155836 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"971dd4b50613508882da83c592f09a1ea0cf6c3e7e4da6859d2919067c1c08df"} err="failed to get container status \"971dd4b50613508882da83c592f09a1ea0cf6c3e7e4da6859d2919067c1c08df\": rpc error: code = NotFound desc = could not find container \"971dd4b50613508882da83c592f09a1ea0cf6c3e7e4da6859d2919067c1c08df\": container with ID starting with 971dd4b50613508882da83c592f09a1ea0cf6c3e7e4da6859d2919067c1c08df not found: ID does not exist" Oct 05 21:35:04 crc kubenswrapper[4753]: I1005 21:35:04.489696 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:35:04 crc kubenswrapper[4753]: I1005 21:35:04.490382 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:35:06 crc kubenswrapper[4753]: I1005 21:35:06.590213 4753 scope.go:117] "RemoveContainer" containerID="c3123e275c3787d7febe754fc369d3797c33babaa22b774cfe4ca3de5bf8c8eb" Oct 05 21:35:17 crc kubenswrapper[4753]: I1005 21:35:17.349010 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-568f5b5b96-6t6qd_75308018-c2d6-42ef-9776-f3b861ec86ed/barbican-api/0.log" Oct 05 21:35:17 crc kubenswrapper[4753]: I1005 21:35:17.424043 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-568f5b5b96-6t6qd_75308018-c2d6-42ef-9776-f3b861ec86ed/barbican-api-log/0.log" Oct 05 21:35:17 crc kubenswrapper[4753]: I1005 21:35:17.717551 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-fc7ffbf98-q7225_1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c/barbican-keystone-listener/0.log" Oct 05 21:35:17 crc kubenswrapper[4753]: I1005 21:35:17.749389 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-fc7ffbf98-q7225_1d4e3a5d-5755-47cc-82af-1e28c4d1cb9c/barbican-keystone-listener-log/0.log" Oct 05 21:35:17 crc kubenswrapper[4753]: I1005 21:35:17.960387 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-c895b6649-pwwkn_4f08b26d-6499-4aac-93c6-7d07e4e98d47/barbican-worker-log/0.log" Oct 05 21:35:17 crc kubenswrapper[4753]: I1005 21:35:17.964554 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-c895b6649-pwwkn_4f08b26d-6499-4aac-93c6-7d07e4e98d47/barbican-worker/0.log" Oct 05 21:35:18 crc kubenswrapper[4753]: I1005 21:35:18.196164 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-rvkzv_2e0a083e-4f35-4cbf-89af-348a03a81159/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:35:18 crc kubenswrapper[4753]: I1005 21:35:18.320617 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_68a4d310-a272-4033-af72-dfc6e8c239f6/ceilometer-central-agent/0.log" Oct 05 21:35:18 crc kubenswrapper[4753]: I1005 21:35:18.396366 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_68a4d310-a272-4033-af72-dfc6e8c239f6/ceilometer-notification-agent/0.log" Oct 05 21:35:18 crc kubenswrapper[4753]: I1005 21:35:18.453159 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_68a4d310-a272-4033-af72-dfc6e8c239f6/proxy-httpd/0.log" Oct 05 21:35:18 crc kubenswrapper[4753]: I1005 21:35:18.545813 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_68a4d310-a272-4033-af72-dfc6e8c239f6/sg-core/0.log" Oct 05 21:35:18 crc kubenswrapper[4753]: I1005 21:35:18.673952 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-d77pj_a3425a91-733d-43c0-b7af-42914da99374/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:35:18 crc kubenswrapper[4753]: I1005 21:35:18.859722 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-cmj42_73e27d2b-d430-4df5-9380-e3b3f6a75420/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:35:19 crc kubenswrapper[4753]: I1005 21:35:19.082561 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ba2c30bc-68b4-4803-852e-b12fe770196d/cinder-api/0.log" Oct 05 21:35:19 crc kubenswrapper[4753]: I1005 21:35:19.104479 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ba2c30bc-68b4-4803-852e-b12fe770196d/cinder-api-log/0.log" Oct 05 21:35:19 crc kubenswrapper[4753]: I1005 21:35:19.388909 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_adbbbc89-97ba-492f-a842-c9bf33a69480/probe/0.log" Oct 05 21:35:19 crc kubenswrapper[4753]: I1005 21:35:19.492689 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_adbbbc89-97ba-492f-a842-c9bf33a69480/cinder-backup/0.log" Oct 05 21:35:19 crc kubenswrapper[4753]: I1005 21:35:19.740899 4753 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_56bdb919-1995-4b2a-855b-4d7ece37ce4c/cinder-scheduler/0.log" Oct 05 21:35:19 crc kubenswrapper[4753]: I1005 21:35:19.800808 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_56bdb919-1995-4b2a-855b-4d7ece37ce4c/probe/0.log" Oct 05 21:35:19 crc kubenswrapper[4753]: I1005 21:35:19.989186 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_45a5357e-d55a-4532-aaff-fe090b71fc60/cinder-volume/0.log" Oct 05 21:35:20 crc kubenswrapper[4753]: I1005 21:35:20.057185 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_45a5357e-d55a-4532-aaff-fe090b71fc60/probe/0.log" Oct 05 21:35:20 crc kubenswrapper[4753]: I1005 21:35:20.796679 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-mf5mn_a6ae6dc6-0e0c-4e51-8426-bf6dceb75f37/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:35:20 crc kubenswrapper[4753]: I1005 21:35:20.851933 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-8zzrj_f21504e5-2012-4b4a-a3fc-16e6dc364373/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:35:21 crc kubenswrapper[4753]: I1005 21:35:21.027846 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67948f47bf-jnd5v_16d16fc5-ebf6-49b5-a837-7b19a005ee21/init/0.log" Oct 05 21:35:21 crc kubenswrapper[4753]: I1005 21:35:21.378901 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e4e4554e-b923-40f1-ac86-abc4cb871d21/glance-httpd/0.log" Oct 05 21:35:21 crc kubenswrapper[4753]: I1005 21:35:21.423807 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-67948f47bf-jnd5v_16d16fc5-ebf6-49b5-a837-7b19a005ee21/init/0.log" Oct 05 21:35:21 crc kubenswrapper[4753]: I1005 21:35:21.514656 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67948f47bf-jnd5v_16d16fc5-ebf6-49b5-a837-7b19a005ee21/dnsmasq-dns/0.log" Oct 05 21:35:22 crc kubenswrapper[4753]: I1005 21:35:22.180560 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4606d5be-d97d-4c1b-95df-1aad021ced17/glance-httpd/0.log" Oct 05 21:35:22 crc kubenswrapper[4753]: I1005 21:35:22.239865 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e4e4554e-b923-40f1-ac86-abc4cb871d21/glance-log/0.log" Oct 05 21:35:22 crc kubenswrapper[4753]: I1005 21:35:22.249501 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4606d5be-d97d-4c1b-95df-1aad021ced17/glance-log/0.log" Oct 05 21:35:22 crc kubenswrapper[4753]: I1005 21:35:22.643252 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-745b9fcf5d-xkxjq_e1309d62-7702-49bc-892f-705d8ac9fff3/horizon/0.log" Oct 05 21:35:22 crc kubenswrapper[4753]: I1005 21:35:22.705930 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-745b9fcf5d-xkxjq_e1309d62-7702-49bc-892f-705d8ac9fff3/horizon-log/0.log" Oct 05 21:35:23 crc kubenswrapper[4753]: I1005 21:35:23.021733 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-r5tfd_13f3a6ea-8a17-4bf8-a252-f53e5856466a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:35:23 crc kubenswrapper[4753]: I1005 21:35:23.126672 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4c5cn_d5a16b03-a799-4548-8a7f-bf73d3f4a52a/install-os-edpm-deployment-openstack-edpm-ipam/0.log" 
Oct 05 21:35:23 crc kubenswrapper[4753]: I1005 21:35:23.409997 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29328301-rlwxs_383607d3-fca4-477a-a189-c6aab8192496/keystone-cron/0.log" Oct 05 21:35:23 crc kubenswrapper[4753]: I1005 21:35:23.505440 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-85d68b7848-bh92h_bdbb6c59-3c98-4b88-a1aa-7304476a522a/keystone-api/0.log" Oct 05 21:35:23 crc kubenswrapper[4753]: I1005 21:35:23.551006 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5dd5b3b0-432b-4040-8544-d68497fca1de/kube-state-metrics/0.log" Oct 05 21:35:23 crc kubenswrapper[4753]: I1005 21:35:23.770016 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-nrg62_9f393cda-bc70-44d4-a534-a72b71dcf0b7/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:35:23 crc kubenswrapper[4753]: I1005 21:35:23.919398 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_d1597495-6f4a-4887-bacc-8082ad9784d4/manila-api/0.log" Oct 05 21:35:23 crc kubenswrapper[4753]: I1005 21:35:23.942527 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_d1597495-6f4a-4887-bacc-8082ad9784d4/manila-api-log/0.log" Oct 05 21:35:24 crc kubenswrapper[4753]: I1005 21:35:24.104507 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a/manila-scheduler/0.log" Oct 05 21:35:24 crc kubenswrapper[4753]: I1005 21:35:24.157664 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_7ceefd75-1ba6-4518-8903-c9ac6c2d8e6a/probe/0.log" Oct 05 21:35:24 crc kubenswrapper[4753]: I1005 21:35:24.348716 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_d8dd124d-011e-41dd-813b-b16ad8039461/manila-share/0.log" Oct 05 
21:35:24 crc kubenswrapper[4753]: I1005 21:35:24.411550 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_d8dd124d-011e-41dd-813b-b16ad8039461/probe/0.log" Oct 05 21:35:24 crc kubenswrapper[4753]: I1005 21:35:24.858204 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-79cfb6d465-74j5v_e26c6617-558f-445a-be5b-02578e006437/neutron-api/0.log" Oct 05 21:35:24 crc kubenswrapper[4753]: I1005 21:35:24.900897 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-79cfb6d465-74j5v_e26c6617-558f-445a-be5b-02578e006437/neutron-httpd/0.log" Oct 05 21:35:25 crc kubenswrapper[4753]: I1005 21:35:25.203435 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-9h96q_e2d62f22-5b64-4b0d-9ee9-e3d4d07b7b85/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:35:25 crc kubenswrapper[4753]: I1005 21:35:25.809818 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_fd2971d2-d61f-4268-9366-6e11ae7f71bc/nova-api-log/0.log" Oct 05 21:35:26 crc kubenswrapper[4753]: I1005 21:35:26.114694 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_d8ddb5b3-36ec-421d-a5d0-f465f7cf0316/nova-cell0-conductor-conductor/0.log" Oct 05 21:35:26 crc kubenswrapper[4753]: I1005 21:35:26.160608 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_fd2971d2-d61f-4268-9366-6e11ae7f71bc/nova-api-api/0.log" Oct 05 21:35:26 crc kubenswrapper[4753]: I1005 21:35:26.589772 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_ec681529-93c2-4792-8e1e-ccbc696ed9ee/nova-cell1-conductor-conductor/0.log" Oct 05 21:35:26 crc kubenswrapper[4753]: I1005 21:35:26.664831 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_be546d4c-4192-4338-aaf3-2849807daf9d/nova-cell1-novncproxy-novncproxy/0.log" Oct 05 21:35:26 crc kubenswrapper[4753]: I1005 21:35:26.975271 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-kh4xk_db911cf0-3e57-45a3-a1ce-06f5260745b4/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:35:27 crc kubenswrapper[4753]: I1005 21:35:27.114473 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3649c567-6d73-4afe-a1aa-d6621a5cc89f/nova-metadata-log/0.log" Oct 05 21:35:27 crc kubenswrapper[4753]: I1005 21:35:27.677567 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_346a135f-f1af-4968-9c9f-4540f2a71161/nova-scheduler-scheduler/0.log" Oct 05 21:35:27 crc kubenswrapper[4753]: I1005 21:35:27.819781 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c20016b6-f321-4cf2-b09a-d35b96c85805/mysql-bootstrap/0.log" Oct 05 21:35:28 crc kubenswrapper[4753]: I1005 21:35:28.034538 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c20016b6-f321-4cf2-b09a-d35b96c85805/mysql-bootstrap/0.log" Oct 05 21:35:28 crc kubenswrapper[4753]: I1005 21:35:28.078045 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c20016b6-f321-4cf2-b09a-d35b96c85805/galera/0.log" Oct 05 21:35:28 crc kubenswrapper[4753]: I1005 21:35:28.336387 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_81bd134f-0bbd-4cda-b29c-d4d514d4dbe7/mysql-bootstrap/0.log" Oct 05 21:35:29 crc kubenswrapper[4753]: I1005 21:35:29.080857 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_81bd134f-0bbd-4cda-b29c-d4d514d4dbe7/mysql-bootstrap/0.log" Oct 05 21:35:29 crc kubenswrapper[4753]: I1005 
21:35:29.111608 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3649c567-6d73-4afe-a1aa-d6621a5cc89f/nova-metadata-metadata/0.log" Oct 05 21:35:29 crc kubenswrapper[4753]: I1005 21:35:29.166690 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_81bd134f-0bbd-4cda-b29c-d4d514d4dbe7/galera/0.log" Oct 05 21:35:29 crc kubenswrapper[4753]: I1005 21:35:29.397448 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ae2325dc-1d53-4605-84d9-c5a341d6c311/openstackclient/0.log" Oct 05 21:35:29 crc kubenswrapper[4753]: I1005 21:35:29.628772 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-7zxq7_61f845cb-9404-421b-b20f-9dee4edd00f8/ovn-controller/0.log" Oct 05 21:35:29 crc kubenswrapper[4753]: I1005 21:35:29.683791 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5xx4p_2ba751b9-51bf-4c3d-8a7a-35af6ebe354f/openstack-network-exporter/0.log" Oct 05 21:35:29 crc kubenswrapper[4753]: I1005 21:35:29.955604 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8krg4_61cd2b9f-f08f-4b47-be04-1d9246a5cbdb/ovsdb-server-init/0.log" Oct 05 21:35:30 crc kubenswrapper[4753]: I1005 21:35:30.619304 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8krg4_61cd2b9f-f08f-4b47-be04-1d9246a5cbdb/ovsdb-server-init/0.log" Oct 05 21:35:30 crc kubenswrapper[4753]: I1005 21:35:30.790200 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8krg4_61cd2b9f-f08f-4b47-be04-1d9246a5cbdb/ovsdb-server/0.log" Oct 05 21:35:30 crc kubenswrapper[4753]: I1005 21:35:30.807097 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8krg4_61cd2b9f-f08f-4b47-be04-1d9246a5cbdb/ovs-vswitchd/0.log" Oct 05 21:35:31 crc kubenswrapper[4753]: I1005 21:35:31.041305 
4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-btv49_103811f6-8ae0-475f-878b-0c5c615265ee/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:35:31 crc kubenswrapper[4753]: I1005 21:35:31.063959 4753 generic.go:334] "Generic (PLEG): container finished" podID="d843dab7-0131-4469-9d9b-d8fcde0368d3" containerID="bfbd6f27c782f252c6ca2d0ed708bcc0e56d3656d3be02c6d06c915536356c51" exitCode=0 Oct 05 21:35:31 crc kubenswrapper[4753]: I1005 21:35:31.064003 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr5ml/crc-debug-k8n7s" event={"ID":"d843dab7-0131-4469-9d9b-d8fcde0368d3","Type":"ContainerDied","Data":"bfbd6f27c782f252c6ca2d0ed708bcc0e56d3656d3be02c6d06c915536356c51"} Oct 05 21:35:31 crc kubenswrapper[4753]: I1005 21:35:31.218111 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0410bd72-9899-4174-9258-4efbdc6cd7c8/openstack-network-exporter/0.log" Oct 05 21:35:31 crc kubenswrapper[4753]: I1005 21:35:31.357566 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0410bd72-9899-4174-9258-4efbdc6cd7c8/ovn-northd/0.log" Oct 05 21:35:31 crc kubenswrapper[4753]: I1005 21:35:31.441311 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_19914a64-715e-4a20-82fc-f4e86b8e9e21/openstack-network-exporter/0.log" Oct 05 21:35:31 crc kubenswrapper[4753]: I1005 21:35:31.699250 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_396e74f9-75f0-4643-a011-da8c56174984/openstack-network-exporter/0.log" Oct 05 21:35:31 crc kubenswrapper[4753]: I1005 21:35:31.699644 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_19914a64-715e-4a20-82fc-f4e86b8e9e21/ovsdbserver-nb/0.log" Oct 05 21:35:31 crc kubenswrapper[4753]: I1005 21:35:31.969536 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_396e74f9-75f0-4643-a011-da8c56174984/ovsdbserver-sb/0.log" Oct 05 21:35:32 crc kubenswrapper[4753]: I1005 21:35:32.062360 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-78454fb4-ktvqp_131ce515-ac42-4446-b075-5e50254e6686/placement-api/0.log" Oct 05 21:35:32 crc kubenswrapper[4753]: I1005 21:35:32.178847 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wr5ml/crc-debug-k8n7s" Oct 05 21:35:32 crc kubenswrapper[4753]: I1005 21:35:32.225444 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wr5ml/crc-debug-k8n7s"] Oct 05 21:35:32 crc kubenswrapper[4753]: I1005 21:35:32.238048 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wr5ml/crc-debug-k8n7s"] Oct 05 21:35:32 crc kubenswrapper[4753]: I1005 21:35:32.280385 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d843dab7-0131-4469-9d9b-d8fcde0368d3-host\") pod \"d843dab7-0131-4469-9d9b-d8fcde0368d3\" (UID: \"d843dab7-0131-4469-9d9b-d8fcde0368d3\") " Oct 05 21:35:32 crc kubenswrapper[4753]: I1005 21:35:32.280517 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlmtw\" (UniqueName: \"kubernetes.io/projected/d843dab7-0131-4469-9d9b-d8fcde0368d3-kube-api-access-zlmtw\") pod \"d843dab7-0131-4469-9d9b-d8fcde0368d3\" (UID: \"d843dab7-0131-4469-9d9b-d8fcde0368d3\") " Oct 05 21:35:32 crc kubenswrapper[4753]: I1005 21:35:32.285253 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d843dab7-0131-4469-9d9b-d8fcde0368d3-host" (OuterVolumeSpecName: "host") pod "d843dab7-0131-4469-9d9b-d8fcde0368d3" (UID: "d843dab7-0131-4469-9d9b-d8fcde0368d3"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 21:35:32 crc kubenswrapper[4753]: I1005 21:35:32.291347 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d843dab7-0131-4469-9d9b-d8fcde0368d3-kube-api-access-zlmtw" (OuterVolumeSpecName: "kube-api-access-zlmtw") pod "d843dab7-0131-4469-9d9b-d8fcde0368d3" (UID: "d843dab7-0131-4469-9d9b-d8fcde0368d3"). InnerVolumeSpecName "kube-api-access-zlmtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:35:32 crc kubenswrapper[4753]: I1005 21:35:32.393019 4753 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d843dab7-0131-4469-9d9b-d8fcde0368d3-host\") on node \"crc\" DevicePath \"\"" Oct 05 21:35:32 crc kubenswrapper[4753]: I1005 21:35:32.393051 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlmtw\" (UniqueName: \"kubernetes.io/projected/d843dab7-0131-4469-9d9b-d8fcde0368d3-kube-api-access-zlmtw\") on node \"crc\" DevicePath \"\"" Oct 05 21:35:32 crc kubenswrapper[4753]: I1005 21:35:32.398990 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-78454fb4-ktvqp_131ce515-ac42-4446-b075-5e50254e6686/placement-log/0.log" Oct 05 21:35:32 crc kubenswrapper[4753]: I1005 21:35:32.522962 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0022b5ba-c84b-4ee1-84a5-8e04d7c4d330/setup-container/0.log" Oct 05 21:35:32 crc kubenswrapper[4753]: I1005 21:35:32.731913 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0022b5ba-c84b-4ee1-84a5-8e04d7c4d330/setup-container/0.log" Oct 05 21:35:32 crc kubenswrapper[4753]: I1005 21:35:32.757324 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0022b5ba-c84b-4ee1-84a5-8e04d7c4d330/rabbitmq/0.log" Oct 05 21:35:32 crc kubenswrapper[4753]: I1005 21:35:32.942953 4753 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_468c6dc5-e196-4084-9211-d2b06253832d/setup-container/0.log" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.087516 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c00f879d4132919e1377a9328f59558e1579722ff523e300fdb140d98ef39b5f" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.087730 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wr5ml/crc-debug-k8n7s" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.232492 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_468c6dc5-e196-4084-9211-d2b06253832d/setup-container/0.log" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.258969 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_468c6dc5-e196-4084-9211-d2b06253832d/rabbitmq/0.log" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.477507 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wr5ml/crc-debug-5pswk"] Oct 05 21:35:33 crc kubenswrapper[4753]: E1005 21:35:33.477850 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d843dab7-0131-4469-9d9b-d8fcde0368d3" containerName="container-00" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.477866 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d843dab7-0131-4469-9d9b-d8fcde0368d3" containerName="container-00" Oct 05 21:35:33 crc kubenswrapper[4753]: E1005 21:35:33.477902 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e4b4643-7e36-4218-8fd1-c819eb806de1" containerName="registry-server" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.477908 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e4b4643-7e36-4218-8fd1-c819eb806de1" containerName="registry-server" Oct 05 21:35:33 crc kubenswrapper[4753]: E1005 21:35:33.477918 
4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e4b4643-7e36-4218-8fd1-c819eb806de1" containerName="extract-content" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.477924 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e4b4643-7e36-4218-8fd1-c819eb806de1" containerName="extract-content" Oct 05 21:35:33 crc kubenswrapper[4753]: E1005 21:35:33.477937 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e4b4643-7e36-4218-8fd1-c819eb806de1" containerName="extract-utilities" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.477943 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e4b4643-7e36-4218-8fd1-c819eb806de1" containerName="extract-utilities" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.478096 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d843dab7-0131-4469-9d9b-d8fcde0368d3" containerName="container-00" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.478118 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e4b4643-7e36-4218-8fd1-c819eb806de1" containerName="registry-server" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.479841 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wr5ml/crc-debug-5pswk" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.488889 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wr5ml"/"default-dockercfg-x66q6" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.523401 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4137a31-2241-4c1f-968a-041a7f378afc-host\") pod \"crc-debug-5pswk\" (UID: \"f4137a31-2241-4c1f-968a-041a7f378afc\") " pod="openshift-must-gather-wr5ml/crc-debug-5pswk" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.523797 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz7cq\" (UniqueName: \"kubernetes.io/projected/f4137a31-2241-4c1f-968a-041a7f378afc-kube-api-access-fz7cq\") pod \"crc-debug-5pswk\" (UID: \"f4137a31-2241-4c1f-968a-041a7f378afc\") " pod="openshift-must-gather-wr5ml/crc-debug-5pswk" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.525467 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-d2r9s_dd50c8ec-c247-4691-9c6d-6c72c1e89227/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.625910 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz7cq\" (UniqueName: \"kubernetes.io/projected/f4137a31-2241-4c1f-968a-041a7f378afc-kube-api-access-fz7cq\") pod \"crc-debug-5pswk\" (UID: \"f4137a31-2241-4c1f-968a-041a7f378afc\") " pod="openshift-must-gather-wr5ml/crc-debug-5pswk" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.625980 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4137a31-2241-4c1f-968a-041a7f378afc-host\") pod \"crc-debug-5pswk\" (UID: 
\"f4137a31-2241-4c1f-968a-041a7f378afc\") " pod="openshift-must-gather-wr5ml/crc-debug-5pswk" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.626088 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4137a31-2241-4c1f-968a-041a7f378afc-host\") pod \"crc-debug-5pswk\" (UID: \"f4137a31-2241-4c1f-968a-041a7f378afc\") " pod="openshift-must-gather-wr5ml/crc-debug-5pswk" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.652950 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-n6kjk_91285735-785c-4889-9913-bb3e58ffed5f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.666588 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz7cq\" (UniqueName: \"kubernetes.io/projected/f4137a31-2241-4c1f-968a-041a7f378afc-kube-api-access-fz7cq\") pod \"crc-debug-5pswk\" (UID: \"f4137a31-2241-4c1f-968a-041a7f378afc\") " pod="openshift-must-gather-wr5ml/crc-debug-5pswk" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.811179 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wr5ml/crc-debug-5pswk" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.874964 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d843dab7-0131-4469-9d9b-d8fcde0368d3" path="/var/lib/kubelet/pods/d843dab7-0131-4469-9d9b-d8fcde0368d3/volumes" Oct 05 21:35:33 crc kubenswrapper[4753]: I1005 21:35:33.934308 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-vqmtg_1073a302-b108-4caa-aa77-78d64fd8f169/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:35:34 crc kubenswrapper[4753]: I1005 21:35:34.103198 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr5ml/crc-debug-5pswk" event={"ID":"f4137a31-2241-4c1f-968a-041a7f378afc","Type":"ContainerStarted","Data":"4a68008b4af85268314481db34acc0713653ef2eb25a5e9959064fe99f991dea"} Oct 05 21:35:34 crc kubenswrapper[4753]: I1005 21:35:34.107931 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-n4pww_5d24d938-36bb-4d7b-94e6-f0332f50a71a/ssh-known-hosts-edpm-deployment/0.log" Oct 05 21:35:34 crc kubenswrapper[4753]: I1005 21:35:34.231848 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_989178f4-ef23-49c1-88f8-10babb448a68/tempest-tests-tempest-tests-runner/0.log" Oct 05 21:35:34 crc kubenswrapper[4753]: I1005 21:35:34.489841 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:35:34 crc kubenswrapper[4753]: I1005 21:35:34.490332 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:35:34 crc kubenswrapper[4753]: I1005 21:35:34.522418 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_cf30bb5c-e675-411f-b50d-77dff81c83af/test-operator-logs-container/0.log" Oct 05 21:35:34 crc kubenswrapper[4753]: I1005 21:35:34.708675 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-xs45v_e9938f80-4e3c-476e-bd1d-11e1646d9176/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 05 21:35:35 crc kubenswrapper[4753]: I1005 21:35:35.111321 4753 generic.go:334] "Generic (PLEG): container finished" podID="f4137a31-2241-4c1f-968a-041a7f378afc" containerID="9e8654d4f1020fa76d95fc3d1f4fd333608448e66dd68879dff4da682461bec1" exitCode=0 Oct 05 21:35:35 crc kubenswrapper[4753]: I1005 21:35:35.111377 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr5ml/crc-debug-5pswk" event={"ID":"f4137a31-2241-4c1f-968a-041a7f378afc","Type":"ContainerDied","Data":"9e8654d4f1020fa76d95fc3d1f4fd333608448e66dd68879dff4da682461bec1"} Oct 05 21:35:36 crc kubenswrapper[4753]: I1005 21:35:36.230935 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wr5ml/crc-debug-5pswk" Oct 05 21:35:36 crc kubenswrapper[4753]: I1005 21:35:36.272785 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz7cq\" (UniqueName: \"kubernetes.io/projected/f4137a31-2241-4c1f-968a-041a7f378afc-kube-api-access-fz7cq\") pod \"f4137a31-2241-4c1f-968a-041a7f378afc\" (UID: \"f4137a31-2241-4c1f-968a-041a7f378afc\") " Oct 05 21:35:36 crc kubenswrapper[4753]: I1005 21:35:36.272821 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4137a31-2241-4c1f-968a-041a7f378afc-host\") pod \"f4137a31-2241-4c1f-968a-041a7f378afc\" (UID: \"f4137a31-2241-4c1f-968a-041a7f378afc\") " Oct 05 21:35:36 crc kubenswrapper[4753]: I1005 21:35:36.273252 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4137a31-2241-4c1f-968a-041a7f378afc-host" (OuterVolumeSpecName: "host") pod "f4137a31-2241-4c1f-968a-041a7f378afc" (UID: "f4137a31-2241-4c1f-968a-041a7f378afc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 21:35:36 crc kubenswrapper[4753]: I1005 21:35:36.306761 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4137a31-2241-4c1f-968a-041a7f378afc-kube-api-access-fz7cq" (OuterVolumeSpecName: "kube-api-access-fz7cq") pod "f4137a31-2241-4c1f-968a-041a7f378afc" (UID: "f4137a31-2241-4c1f-968a-041a7f378afc"). InnerVolumeSpecName "kube-api-access-fz7cq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:35:36 crc kubenswrapper[4753]: I1005 21:35:36.374343 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz7cq\" (UniqueName: \"kubernetes.io/projected/f4137a31-2241-4c1f-968a-041a7f378afc-kube-api-access-fz7cq\") on node \"crc\" DevicePath \"\"" Oct 05 21:35:36 crc kubenswrapper[4753]: I1005 21:35:36.374369 4753 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4137a31-2241-4c1f-968a-041a7f378afc-host\") on node \"crc\" DevicePath \"\"" Oct 05 21:35:37 crc kubenswrapper[4753]: I1005 21:35:37.131158 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr5ml/crc-debug-5pswk" event={"ID":"f4137a31-2241-4c1f-968a-041a7f378afc","Type":"ContainerDied","Data":"4a68008b4af85268314481db34acc0713653ef2eb25a5e9959064fe99f991dea"} Oct 05 21:35:37 crc kubenswrapper[4753]: I1005 21:35:37.131548 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a68008b4af85268314481db34acc0713653ef2eb25a5e9959064fe99f991dea" Oct 05 21:35:37 crc kubenswrapper[4753]: I1005 21:35:37.131638 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wr5ml/crc-debug-5pswk" Oct 05 21:35:44 crc kubenswrapper[4753]: I1005 21:35:44.228619 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wr5ml/crc-debug-5pswk"] Oct 05 21:35:44 crc kubenswrapper[4753]: I1005 21:35:44.239717 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wr5ml/crc-debug-5pswk"] Oct 05 21:35:45 crc kubenswrapper[4753]: I1005 21:35:45.399874 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wr5ml/crc-debug-tppc7"] Oct 05 21:35:45 crc kubenswrapper[4753]: E1005 21:35:45.400235 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4137a31-2241-4c1f-968a-041a7f378afc" containerName="container-00" Oct 05 21:35:45 crc kubenswrapper[4753]: I1005 21:35:45.400246 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4137a31-2241-4c1f-968a-041a7f378afc" containerName="container-00" Oct 05 21:35:45 crc kubenswrapper[4753]: I1005 21:35:45.400421 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4137a31-2241-4c1f-968a-041a7f378afc" containerName="container-00" Oct 05 21:35:45 crc kubenswrapper[4753]: I1005 21:35:45.401013 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wr5ml/crc-debug-tppc7" Oct 05 21:35:45 crc kubenswrapper[4753]: I1005 21:35:45.402963 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wr5ml"/"default-dockercfg-x66q6" Oct 05 21:35:45 crc kubenswrapper[4753]: I1005 21:35:45.492420 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlbcf\" (UniqueName: \"kubernetes.io/projected/99efa172-6d82-4942-b063-52dbdc3645a4-kube-api-access-hlbcf\") pod \"crc-debug-tppc7\" (UID: \"99efa172-6d82-4942-b063-52dbdc3645a4\") " pod="openshift-must-gather-wr5ml/crc-debug-tppc7" Oct 05 21:35:45 crc kubenswrapper[4753]: I1005 21:35:45.492579 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99efa172-6d82-4942-b063-52dbdc3645a4-host\") pod \"crc-debug-tppc7\" (UID: \"99efa172-6d82-4942-b063-52dbdc3645a4\") " pod="openshift-must-gather-wr5ml/crc-debug-tppc7" Oct 05 21:35:45 crc kubenswrapper[4753]: I1005 21:35:45.593774 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99efa172-6d82-4942-b063-52dbdc3645a4-host\") pod \"crc-debug-tppc7\" (UID: \"99efa172-6d82-4942-b063-52dbdc3645a4\") " pod="openshift-must-gather-wr5ml/crc-debug-tppc7" Oct 05 21:35:45 crc kubenswrapper[4753]: I1005 21:35:45.593863 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlbcf\" (UniqueName: \"kubernetes.io/projected/99efa172-6d82-4942-b063-52dbdc3645a4-kube-api-access-hlbcf\") pod \"crc-debug-tppc7\" (UID: \"99efa172-6d82-4942-b063-52dbdc3645a4\") " pod="openshift-must-gather-wr5ml/crc-debug-tppc7" Oct 05 21:35:45 crc kubenswrapper[4753]: I1005 21:35:45.593883 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/99efa172-6d82-4942-b063-52dbdc3645a4-host\") pod \"crc-debug-tppc7\" (UID: \"99efa172-6d82-4942-b063-52dbdc3645a4\") " pod="openshift-must-gather-wr5ml/crc-debug-tppc7" Oct 05 21:35:45 crc kubenswrapper[4753]: I1005 21:35:45.648775 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlbcf\" (UniqueName: \"kubernetes.io/projected/99efa172-6d82-4942-b063-52dbdc3645a4-kube-api-access-hlbcf\") pod \"crc-debug-tppc7\" (UID: \"99efa172-6d82-4942-b063-52dbdc3645a4\") " pod="openshift-must-gather-wr5ml/crc-debug-tppc7" Oct 05 21:35:45 crc kubenswrapper[4753]: I1005 21:35:45.724932 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wr5ml/crc-debug-tppc7" Oct 05 21:35:45 crc kubenswrapper[4753]: I1005 21:35:45.894114 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4137a31-2241-4c1f-968a-041a7f378afc" path="/var/lib/kubelet/pods/f4137a31-2241-4c1f-968a-041a7f378afc/volumes" Oct 05 21:35:46 crc kubenswrapper[4753]: I1005 21:35:46.203861 4753 generic.go:334] "Generic (PLEG): container finished" podID="99efa172-6d82-4942-b063-52dbdc3645a4" containerID="2d9ef25f01a2ef35bbfeabfe325693f0ed11687918d28c870351ed5bfc919982" exitCode=0 Oct 05 21:35:46 crc kubenswrapper[4753]: I1005 21:35:46.203911 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr5ml/crc-debug-tppc7" event={"ID":"99efa172-6d82-4942-b063-52dbdc3645a4","Type":"ContainerDied","Data":"2d9ef25f01a2ef35bbfeabfe325693f0ed11687918d28c870351ed5bfc919982"} Oct 05 21:35:46 crc kubenswrapper[4753]: I1005 21:35:46.203940 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr5ml/crc-debug-tppc7" event={"ID":"99efa172-6d82-4942-b063-52dbdc3645a4","Type":"ContainerStarted","Data":"3a39b1687ed8a363fb1713a0f5bc12e10a7e8ca8871c7788a715cd9a71827c28"} Oct 05 21:35:46 crc kubenswrapper[4753]: I1005 21:35:46.242512 4753 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-must-gather-wr5ml/crc-debug-tppc7"] Oct 05 21:35:46 crc kubenswrapper[4753]: I1005 21:35:46.252228 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wr5ml/crc-debug-tppc7"] Oct 05 21:35:47 crc kubenswrapper[4753]: I1005 21:35:47.313948 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wr5ml/crc-debug-tppc7" Oct 05 21:35:47 crc kubenswrapper[4753]: I1005 21:35:47.428495 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99efa172-6d82-4942-b063-52dbdc3645a4-host\") pod \"99efa172-6d82-4942-b063-52dbdc3645a4\" (UID: \"99efa172-6d82-4942-b063-52dbdc3645a4\") " Oct 05 21:35:47 crc kubenswrapper[4753]: I1005 21:35:47.428775 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlbcf\" (UniqueName: \"kubernetes.io/projected/99efa172-6d82-4942-b063-52dbdc3645a4-kube-api-access-hlbcf\") pod \"99efa172-6d82-4942-b063-52dbdc3645a4\" (UID: \"99efa172-6d82-4942-b063-52dbdc3645a4\") " Oct 05 21:35:47 crc kubenswrapper[4753]: I1005 21:35:47.429909 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99efa172-6d82-4942-b063-52dbdc3645a4-host" (OuterVolumeSpecName: "host") pod "99efa172-6d82-4942-b063-52dbdc3645a4" (UID: "99efa172-6d82-4942-b063-52dbdc3645a4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 05 21:35:47 crc kubenswrapper[4753]: I1005 21:35:47.441308 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99efa172-6d82-4942-b063-52dbdc3645a4-kube-api-access-hlbcf" (OuterVolumeSpecName: "kube-api-access-hlbcf") pod "99efa172-6d82-4942-b063-52dbdc3645a4" (UID: "99efa172-6d82-4942-b063-52dbdc3645a4"). InnerVolumeSpecName "kube-api-access-hlbcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:35:47 crc kubenswrapper[4753]: I1005 21:35:47.531266 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlbcf\" (UniqueName: \"kubernetes.io/projected/99efa172-6d82-4942-b063-52dbdc3645a4-kube-api-access-hlbcf\") on node \"crc\" DevicePath \"\"" Oct 05 21:35:47 crc kubenswrapper[4753]: I1005 21:35:47.531297 4753 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99efa172-6d82-4942-b063-52dbdc3645a4-host\") on node \"crc\" DevicePath \"\"" Oct 05 21:35:47 crc kubenswrapper[4753]: I1005 21:35:47.871914 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99efa172-6d82-4942-b063-52dbdc3645a4" path="/var/lib/kubelet/pods/99efa172-6d82-4942-b063-52dbdc3645a4/volumes" Oct 05 21:35:48 crc kubenswrapper[4753]: I1005 21:35:48.224190 4753 scope.go:117] "RemoveContainer" containerID="2d9ef25f01a2ef35bbfeabfe325693f0ed11687918d28c870351ed5bfc919982" Oct 05 21:35:48 crc kubenswrapper[4753]: I1005 21:35:48.224582 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wr5ml/crc-debug-tppc7" Oct 05 21:35:50 crc kubenswrapper[4753]: I1005 21:35:50.678639 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_5091b95a-0011-45bb-b4b8-be273f03f7b4/memcached/0.log" Oct 05 21:36:04 crc kubenswrapper[4753]: I1005 21:36:04.490477 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:36:04 crc kubenswrapper[4753]: I1005 21:36:04.491028 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:36:04 crc kubenswrapper[4753]: I1005 21:36:04.491074 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 21:36:04 crc kubenswrapper[4753]: I1005 21:36:04.491860 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5905e260f85412f537154f743daa6fab4914430b1ac727576dca9ac43782f6d3"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 21:36:04 crc kubenswrapper[4753]: I1005 21:36:04.491914 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" 
containerID="cri-o://5905e260f85412f537154f743daa6fab4914430b1ac727576dca9ac43782f6d3" gracePeriod=600 Oct 05 21:36:05 crc kubenswrapper[4753]: I1005 21:36:05.399084 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="5905e260f85412f537154f743daa6fab4914430b1ac727576dca9ac43782f6d3" exitCode=0 Oct 05 21:36:05 crc kubenswrapper[4753]: I1005 21:36:05.399152 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"5905e260f85412f537154f743daa6fab4914430b1ac727576dca9ac43782f6d3"} Oct 05 21:36:05 crc kubenswrapper[4753]: I1005 21:36:05.399718 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerStarted","Data":"596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc"} Oct 05 21:36:05 crc kubenswrapper[4753]: I1005 21:36:05.399742 4753 scope.go:117] "RemoveContainer" containerID="814b21fbc6ba0d81b50635c097621f0d634bdf9f1d17990301a3c8a4dcc44691" Oct 05 21:36:06 crc kubenswrapper[4753]: I1005 21:36:06.509020 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574_95cb6a2f-3fb4-43d6-b342-c17059bf0e73/util/0.log" Oct 05 21:36:06 crc kubenswrapper[4753]: I1005 21:36:06.718234 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574_95cb6a2f-3fb4-43d6-b342-c17059bf0e73/pull/0.log" Oct 05 21:36:06 crc kubenswrapper[4753]: I1005 21:36:06.767007 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574_95cb6a2f-3fb4-43d6-b342-c17059bf0e73/util/0.log" Oct 05 
21:36:06 crc kubenswrapper[4753]: I1005 21:36:06.794523 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574_95cb6a2f-3fb4-43d6-b342-c17059bf0e73/pull/0.log" Oct 05 21:36:07 crc kubenswrapper[4753]: I1005 21:36:07.255688 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574_95cb6a2f-3fb4-43d6-b342-c17059bf0e73/extract/0.log" Oct 05 21:36:07 crc kubenswrapper[4753]: I1005 21:36:07.305046 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574_95cb6a2f-3fb4-43d6-b342-c17059bf0e73/util/0.log" Oct 05 21:36:07 crc kubenswrapper[4753]: I1005 21:36:07.376474 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0da5b2e9368362304c733e302da704023e73b0b2df8ed109170f4705a8nq574_95cb6a2f-3fb4-43d6-b342-c17059bf0e73/pull/0.log" Oct 05 21:36:07 crc kubenswrapper[4753]: I1005 21:36:07.470575 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5b974f6766-wsbjp_12ff014d-81e6-4a9e-8197-e28fbfc4a06e/kube-rbac-proxy/0.log" Oct 05 21:36:07 crc kubenswrapper[4753]: I1005 21:36:07.603861 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5b974f6766-wsbjp_12ff014d-81e6-4a9e-8197-e28fbfc4a06e/manager/0.log" Oct 05 21:36:07 crc kubenswrapper[4753]: I1005 21:36:07.649912 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-84bd8f6848-p5g48_64896158-a10b-4fd9-b232-5ba3fa647a02/kube-rbac-proxy/0.log" Oct 05 21:36:07 crc kubenswrapper[4753]: I1005 21:36:07.754547 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-84bd8f6848-p5g48_64896158-a10b-4fd9-b232-5ba3fa647a02/manager/0.log" Oct 05 21:36:07 crc kubenswrapper[4753]: I1005 21:36:07.923490 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-4wqxw_9286136d-f0a7-4488-b346-2b3ea3ab81da/kube-rbac-proxy/0.log" Oct 05 21:36:07 crc kubenswrapper[4753]: I1005 21:36:07.958908 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-4wqxw_9286136d-f0a7-4488-b346-2b3ea3ab81da/manager/0.log" Oct 05 21:36:08 crc kubenswrapper[4753]: I1005 21:36:08.092185 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-698456cdc6-tnt6j_d8c88aaa-c54b-4f65-be07-61e23d5a5cd4/kube-rbac-proxy/0.log" Oct 05 21:36:08 crc kubenswrapper[4753]: I1005 21:36:08.181236 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-698456cdc6-tnt6j_d8c88aaa-c54b-4f65-be07-61e23d5a5cd4/manager/0.log" Oct 05 21:36:08 crc kubenswrapper[4753]: I1005 21:36:08.294239 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5c497dbdb-txd4b_5b8831b7-9250-4ec8-b732-2db04e507cfe/kube-rbac-proxy/0.log" Oct 05 21:36:08 crc kubenswrapper[4753]: I1005 21:36:08.310953 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5c497dbdb-txd4b_5b8831b7-9250-4ec8-b732-2db04e507cfe/manager/0.log" Oct 05 21:36:08 crc kubenswrapper[4753]: I1005 21:36:08.491462 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6675647785-dqfcj_885f705b-599d-41fe-92cf-ffd000ad5e6e/kube-rbac-proxy/0.log" Oct 05 21:36:08 crc kubenswrapper[4753]: I1005 21:36:08.517071 
4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6675647785-dqfcj_885f705b-599d-41fe-92cf-ffd000ad5e6e/manager/0.log" Oct 05 21:36:08 crc kubenswrapper[4753]: I1005 21:36:08.821512 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84788b6bc5-vksxs_f241e98d-8f7c-492a-a4bc-988dc78b6449/kube-rbac-proxy/0.log" Oct 05 21:36:08 crc kubenswrapper[4753]: I1005 21:36:08.978836 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f5894c49f-ct2l6_266b0921-1164-46bc-9e78-986f5ded5943/kube-rbac-proxy/0.log" Oct 05 21:36:08 crc kubenswrapper[4753]: I1005 21:36:08.984194 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84788b6bc5-vksxs_f241e98d-8f7c-492a-a4bc-988dc78b6449/manager/0.log" Oct 05 21:36:09 crc kubenswrapper[4753]: I1005 21:36:09.193124 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-57c9cdcf57-9kjpf_dd3487ac-89f8-40f1-967e-71f7fada0fe1/kube-rbac-proxy/0.log" Oct 05 21:36:09 crc kubenswrapper[4753]: I1005 21:36:09.239091 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-57c9cdcf57-9kjpf_dd3487ac-89f8-40f1-967e-71f7fada0fe1/manager/0.log" Oct 05 21:36:09 crc kubenswrapper[4753]: I1005 21:36:09.258781 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f5894c49f-ct2l6_266b0921-1164-46bc-9e78-986f5ded5943/manager/0.log" Oct 05 21:36:09 crc kubenswrapper[4753]: I1005 21:36:09.453890 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7cb48dbc-f4dp5_0c5e8f9b-e10e-436b-ae33-07a7350f02a1/kube-rbac-proxy/0.log" Oct 05 21:36:09 crc 
kubenswrapper[4753]: I1005 21:36:09.506931 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7cb48dbc-f4dp5_0c5e8f9b-e10e-436b-ae33-07a7350f02a1/manager/0.log" Oct 05 21:36:09 crc kubenswrapper[4753]: I1005 21:36:09.585386 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-d6c9dc5bc-rlsgp_00eefbb7-989e-478d-aad3-ff4d236168f2/kube-rbac-proxy/0.log" Oct 05 21:36:09 crc kubenswrapper[4753]: I1005 21:36:09.731101 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-d6c9dc5bc-rlsgp_00eefbb7-989e-478d-aad3-ff4d236168f2/manager/0.log" Oct 05 21:36:09 crc kubenswrapper[4753]: I1005 21:36:09.766843 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-69b956fbf6-vtgfd_9c8b9aa1-e15e-475d-a02e-56b430d50bd1/kube-rbac-proxy/0.log" Oct 05 21:36:09 crc kubenswrapper[4753]: I1005 21:36:09.829242 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-69b956fbf6-vtgfd_9c8b9aa1-e15e-475d-a02e-56b430d50bd1/manager/0.log" Oct 05 21:36:09 crc kubenswrapper[4753]: I1005 21:36:09.961522 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6c9b57c67-cp4qf_3e514d87-9323-4c3b-a372-60e5c65fa731/kube-rbac-proxy/0.log" Oct 05 21:36:10 crc kubenswrapper[4753]: I1005 21:36:10.027531 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6c9b57c67-cp4qf_3e514d87-9323-4c3b-a372-60e5c65fa731/manager/0.log" Oct 05 21:36:10 crc kubenswrapper[4753]: I1005 21:36:10.057897 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f59f9d8-htstg_d58d3fcd-368b-4d73-8c29-a181f3bdddee/kube-rbac-proxy/0.log" Oct 05 21:36:10 crc kubenswrapper[4753]: I1005 21:36:10.150671 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f59f9d8-htstg_d58d3fcd-368b-4d73-8c29-a181f3bdddee/manager/0.log" Oct 05 21:36:10 crc kubenswrapper[4753]: I1005 21:36:10.293869 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm_28d42154-af7b-440b-af1b-2ef50ee9edca/kube-rbac-proxy/0.log" Oct 05 21:36:10 crc kubenswrapper[4753]: I1005 21:36:10.307691 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-66cc85b5d52x5fm_28d42154-af7b-440b-af1b-2ef50ee9edca/manager/0.log" Oct 05 21:36:10 crc kubenswrapper[4753]: I1005 21:36:10.403445 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cfc658b9-4t94r_d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5/kube-rbac-proxy/0.log" Oct 05 21:36:10 crc kubenswrapper[4753]: I1005 21:36:10.881515 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-677d5bb784-5l45z_22ffe795-4cc5-4c86-9ae6-04999586c7de/operator/0.log" Oct 05 21:36:10 crc kubenswrapper[4753]: I1005 21:36:10.895921 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-677d5bb784-5l45z_22ffe795-4cc5-4c86-9ae6-04999586c7de/kube-rbac-proxy/0.log" Oct 05 21:36:11 crc kubenswrapper[4753]: I1005 21:36:11.157980 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-jswp7_a7f848da-2bdd-4f76-bc85-94cbb95bd680/registry-server/0.log" Oct 05 21:36:11 crc kubenswrapper[4753]: 
I1005 21:36:11.266077 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-c968bb45-xsjn9_8ed6d37f-576a-4f14-a98a-65193559d7de/kube-rbac-proxy/0.log" Oct 05 21:36:11 crc kubenswrapper[4753]: I1005 21:36:11.445599 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-c968bb45-xsjn9_8ed6d37f-576a-4f14-a98a-65193559d7de/manager/0.log" Oct 05 21:36:11 crc kubenswrapper[4753]: I1005 21:36:11.602449 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-66f6d6849b-9dsbr_ff1a796b-8cc7-4c73-842f-7b4a1170b56f/manager/0.log" Oct 05 21:36:11 crc kubenswrapper[4753]: I1005 21:36:11.649495 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-66f6d6849b-9dsbr_ff1a796b-8cc7-4c73-842f-7b4a1170b56f/kube-rbac-proxy/0.log" Oct 05 21:36:11 crc kubenswrapper[4753]: I1005 21:36:11.879064 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-k7kxr_2d0279fb-be4d-47a0-83c7-4452c7b13a5b/operator/0.log" Oct 05 21:36:11 crc kubenswrapper[4753]: I1005 21:36:11.948233 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cfc658b9-4t94r_d8bad9ab-af81-4b7c-bf9a-cf1fa60f9be5/manager/0.log" Oct 05 21:36:12 crc kubenswrapper[4753]: I1005 21:36:12.061648 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-4zhzq_995eda80-87fa-4160-b04e-679668f8d910/kube-rbac-proxy/0.log" Oct 05 21:36:12 crc kubenswrapper[4753]: I1005 21:36:12.089806 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-4zhzq_995eda80-87fa-4160-b04e-679668f8d910/manager/0.log" Oct 05 21:36:12 
crc kubenswrapper[4753]: I1005 21:36:12.247513 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-f589c7597-58qhn_82db7b73-2afb-4063-9d64-fc3fa5559e93/manager/0.log" Oct 05 21:36:12 crc kubenswrapper[4753]: I1005 21:36:12.254819 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-f589c7597-58qhn_82db7b73-2afb-4063-9d64-fc3fa5559e93/kube-rbac-proxy/0.log" Oct 05 21:36:12 crc kubenswrapper[4753]: I1005 21:36:12.270156 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb6dcddc-zqgbl_8dd994e1-cb87-48dc-b844-2bdbc8b6e48d/kube-rbac-proxy/0.log" Oct 05 21:36:12 crc kubenswrapper[4753]: I1005 21:36:12.389871 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb6dcddc-zqgbl_8dd994e1-cb87-48dc-b844-2bdbc8b6e48d/manager/0.log" Oct 05 21:36:12 crc kubenswrapper[4753]: I1005 21:36:12.430703 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5d98cc5575-t99zs_96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3/kube-rbac-proxy/0.log" Oct 05 21:36:12 crc kubenswrapper[4753]: I1005 21:36:12.544381 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5d98cc5575-t99zs_96b277ad-ceca-4e55-9d8e-b2e91bfd1cc3/manager/0.log" Oct 05 21:36:32 crc kubenswrapper[4753]: I1005 21:36:32.155187 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-h7rzp_7da48090-042e-4fef-afdf-9e6e54a89fe2/control-plane-machine-set-operator/0.log" Oct 05 21:36:32 crc kubenswrapper[4753]: I1005 21:36:32.318044 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-d8b6f_22db54ee-7d52-475e-a824-9e563b2920e8/kube-rbac-proxy/0.log" Oct 05 21:36:32 crc kubenswrapper[4753]: I1005 21:36:32.366650 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-d8b6f_22db54ee-7d52-475e-a824-9e563b2920e8/machine-api-operator/0.log" Oct 05 21:36:45 crc kubenswrapper[4753]: I1005 21:36:45.983371 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-54vmd_b8aa872e-b15b-458f-8bf4-0057a25d5d43/cert-manager-controller/0.log" Oct 05 21:36:46 crc kubenswrapper[4753]: I1005 21:36:46.153680 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-shkg2_183d1891-ba1d-4ce0-83bd-9a547d099416/cert-manager-webhook/0.log" Oct 05 21:36:46 crc kubenswrapper[4753]: I1005 21:36:46.160257 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qvslf_3749909c-e0a6-4a84-8e16-e8d104f8bb29/cert-manager-cainjector/0.log" Oct 05 21:36:58 crc kubenswrapper[4753]: I1005 21:36:58.180563 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-md9qn_51e12003-95ee-4af9-a340-5928bb9d7ae7/nmstate-console-plugin/0.log" Oct 05 21:36:58 crc kubenswrapper[4753]: I1005 21:36:58.331415 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-5mhrc_f5c8b381-2e86-47f5-86da-86db3c2aa511/nmstate-handler/0.log" Oct 05 21:36:58 crc kubenswrapper[4753]: I1005 21:36:58.399790 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-q2rfx_d825ed58-9313-4cf2-a923-53e6d809fa60/nmstate-metrics/0.log" Oct 05 21:36:58 crc kubenswrapper[4753]: I1005 21:36:58.400725 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-q2rfx_d825ed58-9313-4cf2-a923-53e6d809fa60/kube-rbac-proxy/0.log" Oct 05 21:36:58 crc kubenswrapper[4753]: I1005 21:36:58.613929 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-l87q6_ae5b861a-0212-4c4c-944b-7d6f3187a5a8/nmstate-operator/0.log" Oct 05 21:36:58 crc kubenswrapper[4753]: I1005 21:36:58.644565 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-k58rx_1b16b6bd-7d95-49c5-a2a8-87b0018e30c7/nmstate-webhook/0.log" Oct 05 21:37:13 crc kubenswrapper[4753]: I1005 21:37:13.358668 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-bbs9p_e8a9ee2c-3d45-4169-b558-9c27e68cc25f/kube-rbac-proxy/0.log" Oct 05 21:37:13 crc kubenswrapper[4753]: I1005 21:37:13.462834 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-bbs9p_e8a9ee2c-3d45-4169-b558-9c27e68cc25f/controller/0.log" Oct 05 21:37:13 crc kubenswrapper[4753]: I1005 21:37:13.561604 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-frr-files/0.log" Oct 05 21:37:13 crc kubenswrapper[4753]: I1005 21:37:13.772462 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-reloader/0.log" Oct 05 21:37:13 crc kubenswrapper[4753]: I1005 21:37:13.810448 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-metrics/0.log" Oct 05 21:37:13 crc kubenswrapper[4753]: I1005 21:37:13.812301 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-reloader/0.log" Oct 05 21:37:13 crc kubenswrapper[4753]: I1005 21:37:13.816308 4753 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-frr-files/0.log" Oct 05 21:37:14 crc kubenswrapper[4753]: I1005 21:37:14.016497 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-frr-files/0.log" Oct 05 21:37:14 crc kubenswrapper[4753]: I1005 21:37:14.046717 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-metrics/0.log" Oct 05 21:37:14 crc kubenswrapper[4753]: I1005 21:37:14.065105 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-reloader/0.log" Oct 05 21:37:14 crc kubenswrapper[4753]: I1005 21:37:14.069211 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-metrics/0.log" Oct 05 21:37:14 crc kubenswrapper[4753]: I1005 21:37:14.274929 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-reloader/0.log" Oct 05 21:37:14 crc kubenswrapper[4753]: I1005 21:37:14.295570 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-frr-files/0.log" Oct 05 21:37:14 crc kubenswrapper[4753]: I1005 21:37:14.323963 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/cp-metrics/0.log" Oct 05 21:37:14 crc kubenswrapper[4753]: I1005 21:37:14.400968 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/controller/0.log" Oct 05 21:37:14 crc kubenswrapper[4753]: I1005 21:37:14.999682 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/frr-metrics/0.log" Oct 05 21:37:15 crc kubenswrapper[4753]: I1005 21:37:15.036906 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/kube-rbac-proxy-frr/0.log" Oct 05 21:37:15 crc kubenswrapper[4753]: I1005 21:37:15.089092 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/kube-rbac-proxy/0.log" Oct 05 21:37:15 crc kubenswrapper[4753]: I1005 21:37:15.340332 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-b4sqj_acdc2b18-6e82-4c5b-964f-b708c56c3704/frr-k8s-webhook-server/0.log" Oct 05 21:37:15 crc kubenswrapper[4753]: I1005 21:37:15.422385 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/reloader/0.log" Oct 05 21:37:15 crc kubenswrapper[4753]: I1005 21:37:15.933141 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-76575689f9-tr955_e9003581-3277-433c-9c49-5a186f493cc5/manager/0.log" Oct 05 21:37:16 crc kubenswrapper[4753]: I1005 21:37:16.283425 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gvxbk_e8a0b57b-6eb5-44de-9ee4-1b8f9a967a54/frr/0.log" Oct 05 21:37:16 crc kubenswrapper[4753]: I1005 21:37:16.283478 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z5l8d_bb160ebf-df7f-4e27-b5f5-0e108e377e5d/kube-rbac-proxy/0.log" Oct 05 21:37:16 crc kubenswrapper[4753]: I1005 21:37:16.310867 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-78c6d655f5-9pcgt_2849418e-7428-46b6-89b0-fc001cb09db2/webhook-server/0.log" Oct 05 21:37:16 crc kubenswrapper[4753]: I1005 21:37:16.696800 4753 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z5l8d_bb160ebf-df7f-4e27-b5f5-0e108e377e5d/speaker/0.log" Oct 05 21:37:30 crc kubenswrapper[4753]: I1005 21:37:30.229688 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88_e668534c-ad83-4c7d-9270-bffa57782d91/util/0.log" Oct 05 21:37:30 crc kubenswrapper[4753]: I1005 21:37:30.428840 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88_e668534c-ad83-4c7d-9270-bffa57782d91/util/0.log" Oct 05 21:37:30 crc kubenswrapper[4753]: I1005 21:37:30.474685 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88_e668534c-ad83-4c7d-9270-bffa57782d91/pull/0.log" Oct 05 21:37:30 crc kubenswrapper[4753]: I1005 21:37:30.504956 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88_e668534c-ad83-4c7d-9270-bffa57782d91/pull/0.log" Oct 05 21:37:30 crc kubenswrapper[4753]: I1005 21:37:30.663942 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88_e668534c-ad83-4c7d-9270-bffa57782d91/util/0.log" Oct 05 21:37:30 crc kubenswrapper[4753]: I1005 21:37:30.702929 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88_e668534c-ad83-4c7d-9270-bffa57782d91/extract/0.log" Oct 05 21:37:30 crc kubenswrapper[4753]: I1005 21:37:30.726749 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2mvg88_e668534c-ad83-4c7d-9270-bffa57782d91/pull/0.log" Oct 05 21:37:30 crc 
kubenswrapper[4753]: I1005 21:37:30.866701 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gdpm6_a4bd315d-8dd0-4844-a675-ddc48827669d/extract-utilities/0.log" Oct 05 21:37:31 crc kubenswrapper[4753]: I1005 21:37:31.525222 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gdpm6_a4bd315d-8dd0-4844-a675-ddc48827669d/extract-utilities/0.log" Oct 05 21:37:31 crc kubenswrapper[4753]: I1005 21:37:31.587713 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gdpm6_a4bd315d-8dd0-4844-a675-ddc48827669d/extract-content/0.log" Oct 05 21:37:31 crc kubenswrapper[4753]: I1005 21:37:31.608121 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gdpm6_a4bd315d-8dd0-4844-a675-ddc48827669d/extract-content/0.log" Oct 05 21:37:31 crc kubenswrapper[4753]: I1005 21:37:31.887839 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gdpm6_a4bd315d-8dd0-4844-a675-ddc48827669d/extract-utilities/0.log" Oct 05 21:37:32 crc kubenswrapper[4753]: I1005 21:37:32.062555 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gdpm6_a4bd315d-8dd0-4844-a675-ddc48827669d/extract-content/0.log" Oct 05 21:37:32 crc kubenswrapper[4753]: I1005 21:37:32.146511 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jq98s_09240459-8b63-4037-b2d7-0f3a2e294835/extract-utilities/0.log" Oct 05 21:37:32 crc kubenswrapper[4753]: I1005 21:37:32.407477 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gdpm6_a4bd315d-8dd0-4844-a675-ddc48827669d/registry-server/0.log" Oct 05 21:37:32 crc kubenswrapper[4753]: I1005 21:37:32.555463 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jq98s_09240459-8b63-4037-b2d7-0f3a2e294835/extract-content/0.log" Oct 05 21:37:32 crc kubenswrapper[4753]: I1005 21:37:32.570578 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jq98s_09240459-8b63-4037-b2d7-0f3a2e294835/extract-content/0.log" Oct 05 21:37:32 crc kubenswrapper[4753]: I1005 21:37:32.608994 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jq98s_09240459-8b63-4037-b2d7-0f3a2e294835/extract-utilities/0.log" Oct 05 21:37:33 crc kubenswrapper[4753]: I1005 21:37:33.356457 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jq98s_09240459-8b63-4037-b2d7-0f3a2e294835/extract-utilities/0.log" Oct 05 21:37:33 crc kubenswrapper[4753]: I1005 21:37:33.442696 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892_2c210d16-5774-42c1-95d1-bba3c106fa44/util/0.log" Oct 05 21:37:33 crc kubenswrapper[4753]: I1005 21:37:33.477676 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jq98s_09240459-8b63-4037-b2d7-0f3a2e294835/extract-content/0.log" Oct 05 21:37:33 crc kubenswrapper[4753]: I1005 21:37:33.693528 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892_2c210d16-5774-42c1-95d1-bba3c106fa44/pull/0.log" Oct 05 21:37:33 crc kubenswrapper[4753]: I1005 21:37:33.801844 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892_2c210d16-5774-42c1-95d1-bba3c106fa44/util/0.log" Oct 05 21:37:33 crc kubenswrapper[4753]: I1005 21:37:33.803893 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892_2c210d16-5774-42c1-95d1-bba3c106fa44/pull/0.log" Oct 05 21:37:34 crc kubenswrapper[4753]: I1005 21:37:34.055494 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892_2c210d16-5774-42c1-95d1-bba3c106fa44/pull/0.log" Oct 05 21:37:34 crc kubenswrapper[4753]: I1005 21:37:34.056397 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jq98s_09240459-8b63-4037-b2d7-0f3a2e294835/registry-server/0.log" Oct 05 21:37:34 crc kubenswrapper[4753]: I1005 21:37:34.098389 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892_2c210d16-5774-42c1-95d1-bba3c106fa44/util/0.log" Oct 05 21:37:34 crc kubenswrapper[4753]: I1005 21:37:34.116183 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cr4892_2c210d16-5774-42c1-95d1-bba3c106fa44/extract/0.log" Oct 05 21:37:34 crc kubenswrapper[4753]: I1005 21:37:34.242720 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-98zw7_1c9573cf-5cd6-4c3b-8c62-0766f942629a/marketplace-operator/0.log" Oct 05 21:37:34 crc kubenswrapper[4753]: I1005 21:37:34.331550 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qnn2p_6b25c928-06b3-4bea-91cd-f1c609d1e785/extract-utilities/0.log" Oct 05 21:37:34 crc kubenswrapper[4753]: I1005 21:37:34.486556 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qnn2p_6b25c928-06b3-4bea-91cd-f1c609d1e785/extract-utilities/0.log" Oct 05 21:37:34 crc kubenswrapper[4753]: I1005 21:37:34.490291 4753 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qnn2p_6b25c928-06b3-4bea-91cd-f1c609d1e785/extract-content/0.log" Oct 05 21:37:34 crc kubenswrapper[4753]: I1005 21:37:34.498170 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qnn2p_6b25c928-06b3-4bea-91cd-f1c609d1e785/extract-content/0.log" Oct 05 21:37:34 crc kubenswrapper[4753]: I1005 21:37:34.667060 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qnn2p_6b25c928-06b3-4bea-91cd-f1c609d1e785/extract-utilities/0.log" Oct 05 21:37:34 crc kubenswrapper[4753]: I1005 21:37:34.729881 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nhg75_7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8/extract-utilities/0.log" Oct 05 21:37:34 crc kubenswrapper[4753]: I1005 21:37:34.739370 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qnn2p_6b25c928-06b3-4bea-91cd-f1c609d1e785/extract-content/0.log" Oct 05 21:37:34 crc kubenswrapper[4753]: I1005 21:37:34.829463 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qnn2p_6b25c928-06b3-4bea-91cd-f1c609d1e785/registry-server/0.log" Oct 05 21:37:34 crc kubenswrapper[4753]: I1005 21:37:34.950990 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nhg75_7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8/extract-utilities/0.log" Oct 05 21:37:34 crc kubenswrapper[4753]: I1005 21:37:34.969648 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nhg75_7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8/extract-content/0.log" Oct 05 21:37:35 crc kubenswrapper[4753]: I1005 21:37:35.001078 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-nhg75_7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8/extract-content/0.log" Oct 05 21:37:35 crc kubenswrapper[4753]: I1005 21:37:35.099913 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nhg75_7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8/extract-content/0.log" Oct 05 21:37:35 crc kubenswrapper[4753]: I1005 21:37:35.108872 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nhg75_7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8/extract-utilities/0.log" Oct 05 21:37:35 crc kubenswrapper[4753]: I1005 21:37:35.613174 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nhg75_7f31a46b-3ec3-42f3-94e4-a5fe09e0a2a8/registry-server/0.log" Oct 05 21:38:04 crc kubenswrapper[4753]: I1005 21:38:04.489713 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:38:04 crc kubenswrapper[4753]: I1005 21:38:04.490196 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:38:34 crc kubenswrapper[4753]: I1005 21:38:34.489697 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:38:34 crc kubenswrapper[4753]: I1005 21:38:34.490539 4753 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:39:04 crc kubenswrapper[4753]: I1005 21:39:04.490034 4753 patch_prober.go:28] interesting pod/machine-config-daemon-xlrkd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 05 21:39:04 crc kubenswrapper[4753]: I1005 21:39:04.490410 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 05 21:39:04 crc kubenswrapper[4753]: I1005 21:39:04.490454 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" Oct 05 21:39:04 crc kubenswrapper[4753]: I1005 21:39:04.491130 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc"} pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 05 21:39:04 crc kubenswrapper[4753]: I1005 21:39:04.491203 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" 
containerName="machine-config-daemon" containerID="cri-o://596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc" gracePeriod=600 Oct 05 21:39:04 crc kubenswrapper[4753]: E1005 21:39:04.613386 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:39:05 crc kubenswrapper[4753]: I1005 21:39:05.048289 4753 generic.go:334] "Generic (PLEG): container finished" podID="a422d983-1769-4d79-9e71-b63bef552d37" containerID="596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc" exitCode=0 Oct 05 21:39:05 crc kubenswrapper[4753]: I1005 21:39:05.048340 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" event={"ID":"a422d983-1769-4d79-9e71-b63bef552d37","Type":"ContainerDied","Data":"596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc"} Oct 05 21:39:05 crc kubenswrapper[4753]: I1005 21:39:05.048377 4753 scope.go:117] "RemoveContainer" containerID="5905e260f85412f537154f743daa6fab4914430b1ac727576dca9ac43782f6d3" Oct 05 21:39:05 crc kubenswrapper[4753]: I1005 21:39:05.049374 4753 scope.go:117] "RemoveContainer" containerID="596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc" Oct 05 21:39:05 crc kubenswrapper[4753]: E1005 21:39:05.049699 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:39:06 crc kubenswrapper[4753]: I1005 21:39:06.783432 4753 scope.go:117] "RemoveContainer" containerID="7fcb6715123de32223e77ad91e9748317edc30f9a0290861151415bf209f5037" Oct 05 21:39:06 crc kubenswrapper[4753]: I1005 21:39:06.826772 4753 scope.go:117] "RemoveContainer" containerID="c502b26c3eed11c00c3aed1a9b8ebc5fe3154d3d32a45337ab46052f1cfc391c" Oct 05 21:39:06 crc kubenswrapper[4753]: I1005 21:39:06.872128 4753 scope.go:117] "RemoveContainer" containerID="e3e3d2fbf601436230be5a37ddb4f041ffba8fa1b772eeab78bcf6e154604c69" Oct 05 21:39:17 crc kubenswrapper[4753]: I1005 21:39:17.853450 4753 scope.go:117] "RemoveContainer" containerID="596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc" Oct 05 21:39:17 crc kubenswrapper[4753]: E1005 21:39:17.854850 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:39:32 crc kubenswrapper[4753]: I1005 21:39:32.852305 4753 scope.go:117] "RemoveContainer" containerID="596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc" Oct 05 21:39:32 crc kubenswrapper[4753]: E1005 21:39:32.853070 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 
05 21:39:47 crc kubenswrapper[4753]: I1005 21:39:47.852880 4753 scope.go:117] "RemoveContainer" containerID="596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc" Oct 05 21:39:47 crc kubenswrapper[4753]: E1005 21:39:47.854602 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:39:58 crc kubenswrapper[4753]: I1005 21:39:58.851816 4753 scope.go:117] "RemoveContainer" containerID="596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc" Oct 05 21:39:58 crc kubenswrapper[4753]: E1005 21:39:58.853420 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:40:06 crc kubenswrapper[4753]: I1005 21:40:06.920433 4753 scope.go:117] "RemoveContainer" containerID="bfbd6f27c782f252c6ca2d0ed708bcc0e56d3656d3be02c6d06c915536356c51" Oct 05 21:40:09 crc kubenswrapper[4753]: I1005 21:40:09.853442 4753 scope.go:117] "RemoveContainer" containerID="596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc" Oct 05 21:40:09 crc kubenswrapper[4753]: E1005 21:40:09.854260 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:40:10 crc kubenswrapper[4753]: I1005 21:40:10.694009 4753 generic.go:334] "Generic (PLEG): container finished" podID="1d98a546-ae8b-432d-a788-79fcac33bcd3" containerID="5f42796b4154efedeaad017dc94f7c0f88a7e70e55236a22659942ad0bd8f3a8" exitCode=0 Oct 05 21:40:10 crc kubenswrapper[4753]: I1005 21:40:10.694058 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wr5ml/must-gather-rwvpx" event={"ID":"1d98a546-ae8b-432d-a788-79fcac33bcd3","Type":"ContainerDied","Data":"5f42796b4154efedeaad017dc94f7c0f88a7e70e55236a22659942ad0bd8f3a8"} Oct 05 21:40:10 crc kubenswrapper[4753]: I1005 21:40:10.695279 4753 scope.go:117] "RemoveContainer" containerID="5f42796b4154efedeaad017dc94f7c0f88a7e70e55236a22659942ad0bd8f3a8" Oct 05 21:40:11 crc kubenswrapper[4753]: I1005 21:40:11.129330 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wr5ml_must-gather-rwvpx_1d98a546-ae8b-432d-a788-79fcac33bcd3/gather/0.log" Oct 05 21:40:22 crc kubenswrapper[4753]: I1005 21:40:22.852981 4753 scope.go:117] "RemoveContainer" containerID="596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc" Oct 05 21:40:22 crc kubenswrapper[4753]: E1005 21:40:22.854217 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:40:26 crc kubenswrapper[4753]: I1005 21:40:26.525814 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-wr5ml/must-gather-rwvpx"] Oct 05 21:40:26 crc kubenswrapper[4753]: I1005 21:40:26.526803 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-wr5ml/must-gather-rwvpx" podUID="1d98a546-ae8b-432d-a788-79fcac33bcd3" containerName="copy" containerID="cri-o://7a9f95472aba224d1d97ff0a74fdda4913be2ac1a56bbb71f9f2a264af3a9cbe" gracePeriod=2 Oct 05 21:40:26 crc kubenswrapper[4753]: I1005 21:40:26.558056 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wr5ml/must-gather-rwvpx"] Oct 05 21:40:26 crc kubenswrapper[4753]: I1005 21:40:26.876715 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wr5ml_must-gather-rwvpx_1d98a546-ae8b-432d-a788-79fcac33bcd3/copy/0.log" Oct 05 21:40:26 crc kubenswrapper[4753]: I1005 21:40:26.877157 4753 generic.go:334] "Generic (PLEG): container finished" podID="1d98a546-ae8b-432d-a788-79fcac33bcd3" containerID="7a9f95472aba224d1d97ff0a74fdda4913be2ac1a56bbb71f9f2a264af3a9cbe" exitCode=143 Oct 05 21:40:27 crc kubenswrapper[4753]: I1005 21:40:27.325698 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wr5ml_must-gather-rwvpx_1d98a546-ae8b-432d-a788-79fcac33bcd3/copy/0.log" Oct 05 21:40:27 crc kubenswrapper[4753]: I1005 21:40:27.326613 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wr5ml/must-gather-rwvpx" Oct 05 21:40:27 crc kubenswrapper[4753]: I1005 21:40:27.444375 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d98a546-ae8b-432d-a788-79fcac33bcd3-must-gather-output\") pod \"1d98a546-ae8b-432d-a788-79fcac33bcd3\" (UID: \"1d98a546-ae8b-432d-a788-79fcac33bcd3\") " Oct 05 21:40:27 crc kubenswrapper[4753]: I1005 21:40:27.444537 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25flr\" (UniqueName: \"kubernetes.io/projected/1d98a546-ae8b-432d-a788-79fcac33bcd3-kube-api-access-25flr\") pod \"1d98a546-ae8b-432d-a788-79fcac33bcd3\" (UID: \"1d98a546-ae8b-432d-a788-79fcac33bcd3\") " Oct 05 21:40:27 crc kubenswrapper[4753]: I1005 21:40:27.487070 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d98a546-ae8b-432d-a788-79fcac33bcd3-kube-api-access-25flr" (OuterVolumeSpecName: "kube-api-access-25flr") pod "1d98a546-ae8b-432d-a788-79fcac33bcd3" (UID: "1d98a546-ae8b-432d-a788-79fcac33bcd3"). InnerVolumeSpecName "kube-api-access-25flr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 05 21:40:27 crc kubenswrapper[4753]: I1005 21:40:27.548446 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25flr\" (UniqueName: \"kubernetes.io/projected/1d98a546-ae8b-432d-a788-79fcac33bcd3-kube-api-access-25flr\") on node \"crc\" DevicePath \"\"" Oct 05 21:40:27 crc kubenswrapper[4753]: I1005 21:40:27.679934 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d98a546-ae8b-432d-a788-79fcac33bcd3-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1d98a546-ae8b-432d-a788-79fcac33bcd3" (UID: "1d98a546-ae8b-432d-a788-79fcac33bcd3"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 05 21:40:27 crc kubenswrapper[4753]: I1005 21:40:27.751585 4753 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d98a546-ae8b-432d-a788-79fcac33bcd3-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 05 21:40:27 crc kubenswrapper[4753]: I1005 21:40:27.862495 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d98a546-ae8b-432d-a788-79fcac33bcd3" path="/var/lib/kubelet/pods/1d98a546-ae8b-432d-a788-79fcac33bcd3/volumes" Oct 05 21:40:27 crc kubenswrapper[4753]: I1005 21:40:27.886937 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wr5ml_must-gather-rwvpx_1d98a546-ae8b-432d-a788-79fcac33bcd3/copy/0.log" Oct 05 21:40:27 crc kubenswrapper[4753]: I1005 21:40:27.887355 4753 scope.go:117] "RemoveContainer" containerID="7a9f95472aba224d1d97ff0a74fdda4913be2ac1a56bbb71f9f2a264af3a9cbe" Oct 05 21:40:27 crc kubenswrapper[4753]: I1005 21:40:27.887482 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wr5ml/must-gather-rwvpx" Oct 05 21:40:27 crc kubenswrapper[4753]: I1005 21:40:27.913839 4753 scope.go:117] "RemoveContainer" containerID="5f42796b4154efedeaad017dc94f7c0f88a7e70e55236a22659942ad0bd8f3a8" Oct 05 21:40:33 crc kubenswrapper[4753]: I1005 21:40:33.851858 4753 scope.go:117] "RemoveContainer" containerID="596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc" Oct 05 21:40:33 crc kubenswrapper[4753]: E1005 21:40:33.852640 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:40:48 crc kubenswrapper[4753]: I1005 21:40:48.852271 4753 scope.go:117] "RemoveContainer" containerID="596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc" Oct 05 21:40:48 crc kubenswrapper[4753]: E1005 21:40:48.853558 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:41:01 crc kubenswrapper[4753]: I1005 21:41:01.857807 4753 scope.go:117] "RemoveContainer" containerID="596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc" Oct 05 21:41:01 crc kubenswrapper[4753]: E1005 21:41:01.858570 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:41:12 crc kubenswrapper[4753]: I1005 21:41:12.852812 4753 scope.go:117] "RemoveContainer" containerID="596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc" Oct 05 21:41:12 crc kubenswrapper[4753]: E1005 21:41:12.853628 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:41:23 crc kubenswrapper[4753]: I1005 21:41:23.852000 4753 scope.go:117] "RemoveContainer" containerID="596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc" Oct 05 21:41:23 crc kubenswrapper[4753]: E1005 21:41:23.853439 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.373381 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5tnb5"] Oct 05 21:41:32 crc kubenswrapper[4753]: E1005 21:41:32.374307 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d98a546-ae8b-432d-a788-79fcac33bcd3" containerName="gather" Oct 05 
21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.374323 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d98a546-ae8b-432d-a788-79fcac33bcd3" containerName="gather" Oct 05 21:41:32 crc kubenswrapper[4753]: E1005 21:41:32.374337 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99efa172-6d82-4942-b063-52dbdc3645a4" containerName="container-00" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.374345 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="99efa172-6d82-4942-b063-52dbdc3645a4" containerName="container-00" Oct 05 21:41:32 crc kubenswrapper[4753]: E1005 21:41:32.374375 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d98a546-ae8b-432d-a788-79fcac33bcd3" containerName="copy" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.374384 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d98a546-ae8b-432d-a788-79fcac33bcd3" containerName="copy" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.374605 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d98a546-ae8b-432d-a788-79fcac33bcd3" containerName="copy" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.374640 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="99efa172-6d82-4942-b063-52dbdc3645a4" containerName="container-00" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.374651 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d98a546-ae8b-432d-a788-79fcac33bcd3" containerName="gather" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.377471 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5tnb5" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.399422 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5tnb5"] Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.437005 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4nls\" (UniqueName: \"kubernetes.io/projected/90957ddb-ce88-4431-9ab1-2d9d2ee88202-kube-api-access-x4nls\") pod \"community-operators-5tnb5\" (UID: \"90957ddb-ce88-4431-9ab1-2d9d2ee88202\") " pod="openshift-marketplace/community-operators-5tnb5" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.437093 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90957ddb-ce88-4431-9ab1-2d9d2ee88202-catalog-content\") pod \"community-operators-5tnb5\" (UID: \"90957ddb-ce88-4431-9ab1-2d9d2ee88202\") " pod="openshift-marketplace/community-operators-5tnb5" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.437135 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90957ddb-ce88-4431-9ab1-2d9d2ee88202-utilities\") pod \"community-operators-5tnb5\" (UID: \"90957ddb-ce88-4431-9ab1-2d9d2ee88202\") " pod="openshift-marketplace/community-operators-5tnb5" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.538379 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4nls\" (UniqueName: \"kubernetes.io/projected/90957ddb-ce88-4431-9ab1-2d9d2ee88202-kube-api-access-x4nls\") pod \"community-operators-5tnb5\" (UID: \"90957ddb-ce88-4431-9ab1-2d9d2ee88202\") " pod="openshift-marketplace/community-operators-5tnb5" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.538483 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90957ddb-ce88-4431-9ab1-2d9d2ee88202-catalog-content\") pod \"community-operators-5tnb5\" (UID: \"90957ddb-ce88-4431-9ab1-2d9d2ee88202\") " pod="openshift-marketplace/community-operators-5tnb5" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.538526 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90957ddb-ce88-4431-9ab1-2d9d2ee88202-utilities\") pod \"community-operators-5tnb5\" (UID: \"90957ddb-ce88-4431-9ab1-2d9d2ee88202\") " pod="openshift-marketplace/community-operators-5tnb5" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.539049 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90957ddb-ce88-4431-9ab1-2d9d2ee88202-utilities\") pod \"community-operators-5tnb5\" (UID: \"90957ddb-ce88-4431-9ab1-2d9d2ee88202\") " pod="openshift-marketplace/community-operators-5tnb5" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.539064 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90957ddb-ce88-4431-9ab1-2d9d2ee88202-catalog-content\") pod \"community-operators-5tnb5\" (UID: \"90957ddb-ce88-4431-9ab1-2d9d2ee88202\") " pod="openshift-marketplace/community-operators-5tnb5" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.569045 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4nls\" (UniqueName: \"kubernetes.io/projected/90957ddb-ce88-4431-9ab1-2d9d2ee88202-kube-api-access-x4nls\") pod \"community-operators-5tnb5\" (UID: \"90957ddb-ce88-4431-9ab1-2d9d2ee88202\") " pod="openshift-marketplace/community-operators-5tnb5" Oct 05 21:41:32 crc kubenswrapper[4753]: I1005 21:41:32.711370 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5tnb5" Oct 05 21:41:33 crc kubenswrapper[4753]: I1005 21:41:33.281708 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5tnb5"] Oct 05 21:41:33 crc kubenswrapper[4753]: W1005 21:41:33.299668 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90957ddb_ce88_4431_9ab1_2d9d2ee88202.slice/crio-76db4fddf786908d228afc2db70a11b7cae887177c81648d4d24968e0d34ad84 WatchSource:0}: Error finding container 76db4fddf786908d228afc2db70a11b7cae887177c81648d4d24968e0d34ad84: Status 404 returned error can't find the container with id 76db4fddf786908d228afc2db70a11b7cae887177c81648d4d24968e0d34ad84 Oct 05 21:41:33 crc kubenswrapper[4753]: I1005 21:41:33.508307 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tnb5" event={"ID":"90957ddb-ce88-4431-9ab1-2d9d2ee88202","Type":"ContainerStarted","Data":"76db4fddf786908d228afc2db70a11b7cae887177c81648d4d24968e0d34ad84"} Oct 05 21:41:34 crc kubenswrapper[4753]: I1005 21:41:34.521559 4753 generic.go:334] "Generic (PLEG): container finished" podID="90957ddb-ce88-4431-9ab1-2d9d2ee88202" containerID="6c4df8928f5b9fb4b00837438f603bac4c1c9d444276659ba257d06b9a96776e" exitCode=0 Oct 05 21:41:34 crc kubenswrapper[4753]: I1005 21:41:34.521680 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tnb5" event={"ID":"90957ddb-ce88-4431-9ab1-2d9d2ee88202","Type":"ContainerDied","Data":"6c4df8928f5b9fb4b00837438f603bac4c1c9d444276659ba257d06b9a96776e"} Oct 05 21:41:34 crc kubenswrapper[4753]: I1005 21:41:34.527245 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 05 21:41:36 crc kubenswrapper[4753]: I1005 21:41:36.540385 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-5tnb5" event={"ID":"90957ddb-ce88-4431-9ab1-2d9d2ee88202","Type":"ContainerStarted","Data":"a0819c2c00a348d3dd9a49ebab4bea241f1471c0e07341786c33a7a59b8364b2"} Oct 05 21:41:37 crc kubenswrapper[4753]: I1005 21:41:37.550501 4753 generic.go:334] "Generic (PLEG): container finished" podID="90957ddb-ce88-4431-9ab1-2d9d2ee88202" containerID="a0819c2c00a348d3dd9a49ebab4bea241f1471c0e07341786c33a7a59b8364b2" exitCode=0 Oct 05 21:41:37 crc kubenswrapper[4753]: I1005 21:41:37.550760 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tnb5" event={"ID":"90957ddb-ce88-4431-9ab1-2d9d2ee88202","Type":"ContainerDied","Data":"a0819c2c00a348d3dd9a49ebab4bea241f1471c0e07341786c33a7a59b8364b2"} Oct 05 21:41:37 crc kubenswrapper[4753]: I1005 21:41:37.852820 4753 scope.go:117] "RemoveContainer" containerID="596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc" Oct 05 21:41:37 crc kubenswrapper[4753]: E1005 21:41:37.853306 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37" Oct 05 21:41:38 crc kubenswrapper[4753]: I1005 21:41:38.563210 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tnb5" event={"ID":"90957ddb-ce88-4431-9ab1-2d9d2ee88202","Type":"ContainerStarted","Data":"28914fca83e8a8025771f1395d9802f7cffccf4419c16e4e1bf9fbcb77ba4e2f"} Oct 05 21:41:38 crc kubenswrapper[4753]: I1005 21:41:38.588977 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5tnb5" 
podStartSLOduration=3.136553768 podStartE2EDuration="6.588950667s" podCreationTimestamp="2025-10-05 21:41:32 +0000 UTC" firstStartedPulling="2025-10-05 21:41:34.52584426 +0000 UTC m=+5203.374172502" lastFinishedPulling="2025-10-05 21:41:37.978241149 +0000 UTC m=+5206.826569401" observedRunningTime="2025-10-05 21:41:38.582938682 +0000 UTC m=+5207.431266914" watchObservedRunningTime="2025-10-05 21:41:38.588950667 +0000 UTC m=+5207.437278929"
Oct 05 21:41:42 crc kubenswrapper[4753]: I1005 21:41:42.711789 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5tnb5"
Oct 05 21:41:42 crc kubenswrapper[4753]: I1005 21:41:42.712602 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5tnb5"
Oct 05 21:41:42 crc kubenswrapper[4753]: I1005 21:41:42.806803 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5tnb5"
Oct 05 21:41:43 crc kubenswrapper[4753]: I1005 21:41:43.679648 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5tnb5"
Oct 05 21:41:43 crc kubenswrapper[4753]: I1005 21:41:43.746928 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5tnb5"]
Oct 05 21:41:45 crc kubenswrapper[4753]: I1005 21:41:45.640470 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5tnb5" podUID="90957ddb-ce88-4431-9ab1-2d9d2ee88202" containerName="registry-server" containerID="cri-o://28914fca83e8a8025771f1395d9802f7cffccf4419c16e4e1bf9fbcb77ba4e2f" gracePeriod=2
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.122004 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tnb5"
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.213897 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90957ddb-ce88-4431-9ab1-2d9d2ee88202-utilities\") pod \"90957ddb-ce88-4431-9ab1-2d9d2ee88202\" (UID: \"90957ddb-ce88-4431-9ab1-2d9d2ee88202\") "
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.213992 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90957ddb-ce88-4431-9ab1-2d9d2ee88202-catalog-content\") pod \"90957ddb-ce88-4431-9ab1-2d9d2ee88202\" (UID: \"90957ddb-ce88-4431-9ab1-2d9d2ee88202\") "
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.214038 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4nls\" (UniqueName: \"kubernetes.io/projected/90957ddb-ce88-4431-9ab1-2d9d2ee88202-kube-api-access-x4nls\") pod \"90957ddb-ce88-4431-9ab1-2d9d2ee88202\" (UID: \"90957ddb-ce88-4431-9ab1-2d9d2ee88202\") "
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.214859 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90957ddb-ce88-4431-9ab1-2d9d2ee88202-utilities" (OuterVolumeSpecName: "utilities") pod "90957ddb-ce88-4431-9ab1-2d9d2ee88202" (UID: "90957ddb-ce88-4431-9ab1-2d9d2ee88202"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.219959 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90957ddb-ce88-4431-9ab1-2d9d2ee88202-kube-api-access-x4nls" (OuterVolumeSpecName: "kube-api-access-x4nls") pod "90957ddb-ce88-4431-9ab1-2d9d2ee88202" (UID: "90957ddb-ce88-4431-9ab1-2d9d2ee88202"). InnerVolumeSpecName "kube-api-access-x4nls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.316919 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90957ddb-ce88-4431-9ab1-2d9d2ee88202-utilities\") on node \"crc\" DevicePath \"\""
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.316963 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4nls\" (UniqueName: \"kubernetes.io/projected/90957ddb-ce88-4431-9ab1-2d9d2ee88202-kube-api-access-x4nls\") on node \"crc\" DevicePath \"\""
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.649771 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90957ddb-ce88-4431-9ab1-2d9d2ee88202-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90957ddb-ce88-4431-9ab1-2d9d2ee88202" (UID: "90957ddb-ce88-4431-9ab1-2d9d2ee88202"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.657091 4753 generic.go:334] "Generic (PLEG): container finished" podID="90957ddb-ce88-4431-9ab1-2d9d2ee88202" containerID="28914fca83e8a8025771f1395d9802f7cffccf4419c16e4e1bf9fbcb77ba4e2f" exitCode=0
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.657194 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tnb5" event={"ID":"90957ddb-ce88-4431-9ab1-2d9d2ee88202","Type":"ContainerDied","Data":"28914fca83e8a8025771f1395d9802f7cffccf4419c16e4e1bf9fbcb77ba4e2f"}
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.657251 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tnb5" event={"ID":"90957ddb-ce88-4431-9ab1-2d9d2ee88202","Type":"ContainerDied","Data":"76db4fddf786908d228afc2db70a11b7cae887177c81648d4d24968e0d34ad84"}
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.657284 4753 scope.go:117] "RemoveContainer" containerID="28914fca83e8a8025771f1395d9802f7cffccf4419c16e4e1bf9fbcb77ba4e2f"
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.657456 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tnb5"
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.704076 4753 scope.go:117] "RemoveContainer" containerID="a0819c2c00a348d3dd9a49ebab4bea241f1471c0e07341786c33a7a59b8364b2"
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.714035 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5tnb5"]
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.722879 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90957ddb-ce88-4431-9ab1-2d9d2ee88202-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.743830 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5tnb5"]
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.745087 4753 scope.go:117] "RemoveContainer" containerID="6c4df8928f5b9fb4b00837438f603bac4c1c9d444276659ba257d06b9a96776e"
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.778922 4753 scope.go:117] "RemoveContainer" containerID="28914fca83e8a8025771f1395d9802f7cffccf4419c16e4e1bf9fbcb77ba4e2f"
Oct 05 21:41:46 crc kubenswrapper[4753]: E1005 21:41:46.779639 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28914fca83e8a8025771f1395d9802f7cffccf4419c16e4e1bf9fbcb77ba4e2f\": container with ID starting with 28914fca83e8a8025771f1395d9802f7cffccf4419c16e4e1bf9fbcb77ba4e2f not found: ID does not exist" containerID="28914fca83e8a8025771f1395d9802f7cffccf4419c16e4e1bf9fbcb77ba4e2f"
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.779687 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28914fca83e8a8025771f1395d9802f7cffccf4419c16e4e1bf9fbcb77ba4e2f"} err="failed to get container status \"28914fca83e8a8025771f1395d9802f7cffccf4419c16e4e1bf9fbcb77ba4e2f\": rpc error: code = NotFound desc = could not find container \"28914fca83e8a8025771f1395d9802f7cffccf4419c16e4e1bf9fbcb77ba4e2f\": container with ID starting with 28914fca83e8a8025771f1395d9802f7cffccf4419c16e4e1bf9fbcb77ba4e2f not found: ID does not exist"
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.779719 4753 scope.go:117] "RemoveContainer" containerID="a0819c2c00a348d3dd9a49ebab4bea241f1471c0e07341786c33a7a59b8364b2"
Oct 05 21:41:46 crc kubenswrapper[4753]: E1005 21:41:46.780214 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0819c2c00a348d3dd9a49ebab4bea241f1471c0e07341786c33a7a59b8364b2\": container with ID starting with a0819c2c00a348d3dd9a49ebab4bea241f1471c0e07341786c33a7a59b8364b2 not found: ID does not exist" containerID="a0819c2c00a348d3dd9a49ebab4bea241f1471c0e07341786c33a7a59b8364b2"
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.780253 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0819c2c00a348d3dd9a49ebab4bea241f1471c0e07341786c33a7a59b8364b2"} err="failed to get container status \"a0819c2c00a348d3dd9a49ebab4bea241f1471c0e07341786c33a7a59b8364b2\": rpc error: code = NotFound desc = could not find container \"a0819c2c00a348d3dd9a49ebab4bea241f1471c0e07341786c33a7a59b8364b2\": container with ID starting with a0819c2c00a348d3dd9a49ebab4bea241f1471c0e07341786c33a7a59b8364b2 not found: ID does not exist"
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.780279 4753 scope.go:117] "RemoveContainer" containerID="6c4df8928f5b9fb4b00837438f603bac4c1c9d444276659ba257d06b9a96776e"
Oct 05 21:41:46 crc kubenswrapper[4753]: E1005 21:41:46.780782 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c4df8928f5b9fb4b00837438f603bac4c1c9d444276659ba257d06b9a96776e\": container with ID starting with 6c4df8928f5b9fb4b00837438f603bac4c1c9d444276659ba257d06b9a96776e not found: ID does not exist" containerID="6c4df8928f5b9fb4b00837438f603bac4c1c9d444276659ba257d06b9a96776e"
Oct 05 21:41:46 crc kubenswrapper[4753]: I1005 21:41:46.780819 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c4df8928f5b9fb4b00837438f603bac4c1c9d444276659ba257d06b9a96776e"} err="failed to get container status \"6c4df8928f5b9fb4b00837438f603bac4c1c9d444276659ba257d06b9a96776e\": rpc error: code = NotFound desc = could not find container \"6c4df8928f5b9fb4b00837438f603bac4c1c9d444276659ba257d06b9a96776e\": container with ID starting with 6c4df8928f5b9fb4b00837438f603bac4c1c9d444276659ba257d06b9a96776e not found: ID does not exist"
Oct 05 21:41:47 crc kubenswrapper[4753]: I1005 21:41:47.866537 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90957ddb-ce88-4431-9ab1-2d9d2ee88202" path="/var/lib/kubelet/pods/90957ddb-ce88-4431-9ab1-2d9d2ee88202/volumes"
Oct 05 21:41:49 crc kubenswrapper[4753]: I1005 21:41:49.852705 4753 scope.go:117] "RemoveContainer" containerID="596c87a823109ea3a88d2e87f4b84340416dbe39eaa970383487b247f0ec78dc"
Oct 05 21:41:49 crc kubenswrapper[4753]: E1005 21:41:49.854398 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xlrkd_openshift-machine-config-operator(a422d983-1769-4d79-9e71-b63bef552d37)\"" pod="openshift-machine-config-operator/machine-config-daemon-xlrkd" podUID="a422d983-1769-4d79-9e71-b63bef552d37"